Troubleshooting Image Consistency Errors
Image consistency errors have a variety of causes, and the user may have to address more than one cause to correct the errors. This application note describes the most common causes of image consistency errors and suggests ways to correct them. For information about these errors occurring with GigE cameras on Linux, see KB10016 Lost Ethernet data packets on Linux systems.
Before you use your camera, we recommend that you review the following resources, available from our downloads page:
Getting Started Manual for the camera—provides information on installing components and software needed to run the camera.
Technical Reference for the camera—provides information on the camera’s specifications, features, and operation, as well as imaging and acquisition controls.
Firmware updates—ensure you are using the most up-to-date firmware for the camera to take advantage of improvements and fixes.
Tech Insights—Subscribe to our monthly email updates containing information on new knowledge base articles, new firmware and software releases, and Product Change Notices (PCN).
Image consistency errors refer to a range of messages sent by the driver when an image is missing packets or when the camera is otherwise unsuccessful at transmitting image data to the CPU.
FlyCapture reports image consistency errors in the GUI Event Statistics window, as shown here. The viewer does not display images with errors, such as torn images, because FlyCapture discards them. Other software packages, such as NI-MAX, display torn images when packets are missing.
This table summarizes the most common causes of image consistency errors and their possible solutions. Additional details of possible solutions are given in later sections.
|Cause of Error|Possible Solutions|
|---|---|
|The GigE performance driver (image filter driver) is not installed.|Install or update the filter driver.|
|The jumbo packet option is not enabled by (or is not supported on) the network adapter.|Ensure the driver is up-to-date. Enable Interruption mode. Enable the jumbo packet option.|
|Packet resend is not turned on.|Turn on the packet resend option.|
|Packet size and delay are not tuned.|Optimize the packet size and delay settings to manage the bandwidth allocation.|
|Too many dropped packets for packet resend to manage.|Check the compatibility of the hardware. Try a different Ethernet cable or network adapter, and make sure the adapter is capable of handling the bandwidth. If you are using a multi-camera set-up, you may have to lower the bandwidth being transferred to the adapter.|
|DPC latency rate is too high.|Review the number of tasks being performed and the overall demand on the CPU.|
|Interrupt Moderation Rate is not enabled, or has an incorrect setting.|Turn on the Interrupt Moderation Rate or adjust it according to your configuration and data rate.|
|Number of receive buffers is set too low.|Increase the number of receive buffers.|
|Lost data packets when streaming in Linux.|Increase the packet delay time or increase the amount of memory for receive buffers.|
The Windows network stack copies network packets from kernel mode to user mode, and the user must then make another copy of the image data. Installing an image filter driver eliminates the need for the second copy because the driver copies the image data at the kernel level. This reduces the workload on the CPU and lessens the chance of image consistency errors.
To install FlyCapture’s filter driver, first install FlyCapture, then enable the appropriate drivers.
To install FlyCapture:
To enable the filter driver:
Using a network adapter with jumbo packets (jumbo frames) enabled results in fewer image consistency errors, because larger packets mean less overhead on the host CPU. Typically, network drivers split data larger than 1500 bytes into multiple packets. However, the GigE Vision standard allows packet sizes of up to 9014 bytes. These large packets, also called jumbo packets, allow the camera to transfer data across the network more efficiently.
Using jumbo packets results in fewer packets per image frame, which means fewer interruptions for the host; the host is then less likely to drop data and produce image consistency errors.
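As a rough illustration of the reduction in packet count, the packets required per frame can be estimated from the image size and the payload per packet. The per-packet header overhead below is an assumption for illustration; the exact figure depends on the camera and network stack.

```python
# Rough estimate of packets per image frame for standard vs. jumbo packets.
import math

HEADER_OVERHEAD = 36  # assumed bytes of header overhead per packet (illustrative)

def packets_per_frame(image_bytes, packet_size):
    """Number of packets needed to carry one image frame."""
    payload = packet_size - HEADER_OVERHEAD
    return math.ceil(image_bytes / payload)

frame = 1280 * 960  # bytes for a 1.2 MP, 8-bit mono image
print(packets_per_frame(frame, 1500))  # -> 840 packets per frame
print(packets_per_frame(frame, 9000))  # -> 138 packets per frame
```

With jumbo packets the host services roughly six times fewer packets per frame, and correspondingly fewer interrupts.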
FlyCapture does not prompt the user to enable jumbo packets, but some third-party software, such as NI-MAX, displays an error message prompting the user to enable jumbo packets when errors occur that may result from this option being disabled.
To enable jumbo packets:
Packet data is copied into the image buffer at the kernel level. Once all of the image data is received, the driver passes the buffer back to user mode. When the driver detects that a packet has arrived out of sequence (based on the packet number), subsequent packets are placed in kernel memory until the missing packet arrives. If the missing packet does not arrive within a user-defined time, the driver transmits a resend request for that packet. When all missing packets have arrived, the driver transfers them from kernel memory to user memory.
Packet resend does not prevent all image consistency errors. If the host is overwhelmed by the speed of the arriving data, or if the transmission is so poor that most packets are being lost, packet resend may not be able to prevent all of the errors. If multiple resends are issued and the packets are consistently lost, torn images or image consistency errors may still occur. When there are too many packets to resend, an image consistency error occurs. The driver may also detect more than one packet gap in the image data and drop the image, because the camera firmware does not attempt more than one resend request per image.
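The gap detection described above can be sketched as follows. This is a hypothetical helper, not the actual driver code: packets carry sequence numbers, and a gap in the received numbers identifies the packets that a resend request would ask for.

```python
# Minimal sketch of sequence-gap detection (hypothetical, for illustration).
def find_missing_packets(received_ids, total_packets):
    """Return the packet IDs absent from a frame of `total_packets` packets."""
    seen = set(received_ids)
    return [i for i in range(1, total_packets + 1) if i not in seen]

# Packets 4 and 7 never arrived; the driver would request a resend for them.
print(find_missing_packets([1, 2, 3, 5, 6, 8], 8))  # -> [4, 7]
```

If the list contains more than one gap, the driver described above would drop the image rather than issue a second resend request.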
When packet resend is turned on, the resend requested and received counts increase. Some third-party software, such as NI-MAX, includes a packet resend option.
There is no specific formula to calculate the optimum frames per second (FPS) for a particular packet size and delay setting, but a greater delay between packets means less demand on the host and driver. Set the packet size to the maximum (9000 bytes with jumbo packets enabled) and choose the maximum delay time that still allows the required frame rate without errors. Adjust the packet size and delay time as required to minimize errors or dropped packets.
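While there is no exact formula, a back-of-envelope estimate shows how packet size and delay bound the achievable frame rate. This simplified model ignores protocol overhead, and real cameras specify the delay in timestamp ticks rather than seconds, so treat it only as a starting point for tuning.

```python
# Back-of-envelope estimate of maximum frame rate for a packet size/delay pair.
import math

def max_fps(image_bytes, packet_size, delay_s, link_bps=1e9):
    """Upper bound on FPS: each packet takes wire time plus the configured delay."""
    packets = math.ceil(image_bytes / packet_size)
    wire_time = packet_size * 8 / link_bps  # seconds to transmit one packet
    return 1.0 / (packets * (wire_time + delay_s))

# A 1.2 MB frame, 9000-byte packets, 10 microseconds between packets:
print(round(max_fps(1_200_000, 9000, 10e-6)))  # -> 91 FPS
```

Increasing the delay lowers this bound; the goal is the largest delay whose bound still exceeds your required frame rate.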
Incompatible hardware may cause too many dropped packets for packet resend to correct. Try a different Ethernet cable or network adapter. Check the bandwidth capabilities of the network adapter: a typical network adapter handles 1 Gb/s, but some older adapters can only handle 100 Mb/s (10x slower). If the adapter cannot handle the amount of image data being transferred, the result is a large number of dropped packets.
A multi-camera set-up connected through a switch to a single network port may transfer more data than the network can process, resulting in dropped packets. To lower the bandwidth transferred to the network adapter, lower the frame rate or the image resolution on one or more cameras. Make sure that you are sending less than the maximum amount of data for your network adapter (see above).
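A quick way to check a multi-camera set-up is to total the raw pixel bandwidth of all cameras and compare it against the link capacity. The calculation below assumes 8-bit pixels and ignores protocol overhead, so the real requirement is somewhat higher than it reports.

```python
# Estimate the combined bandwidth of a multi-camera set-up on one link.
def total_bandwidth_bps(cameras):
    """cameras: list of (width, height, fps) tuples; assumes 8 bits per pixel."""
    return sum(w * h * fps * 8 for w, h, fps in cameras)

# Two 1280x960 cameras at 50 FPS through a switch onto one 1 Gb/s port:
usage = total_bandwidth_bps([(1280, 960, 50), (1280, 960, 50)])
print(usage / 1e9)  # -> 0.98304 Gb/s, close to saturating the link
```

If the total approaches or exceeds the adapter's capacity, lower the frame rate or resolution on one or more cameras until it does not.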
A DPC (deferred procedure call) is a mechanism that defers low-priority tasks in order to process high-priority tasks immediately. A high DPC latency indicates that the system is experiencing many delays as a result of high bandwidth demand, and that image packet data is not being handled quickly enough to avoid dropped packets and image consistency errors.
When the PC performs side tasks, such as opening an image stream or playing media while the camera is running, these tasks can generate a number of DPCs. If those DPCs run on the same core and at the same priority as the image-data handling (grabbing and streaming), the system struggles to service all the interrupts at once. On a multi-core system, DPCs created elsewhere (USB, graphics) are likely to run on other cores, but any DPC created by a network driver is likely to run on the same core as the image data.
This screen capture shows a high DPC latency while a media file is being played on YouTube. Note the spike in CPU usage and the corresponding errors. As the CPU usage stabilizes, the image consistency errors disappear, even though the media is still playing. This shows how the initial creation of DPCs affects the rate of image grabbing from the camera. As the DPCs are handled, the image grabbing rate recovers, even while the video continues to play. To reduce the latency, avoid running too many tasks simultaneously; resources diverted from the camera to other tasks reduce the resources available for image capture.
In addition, latency issues may be caused by drivers other than network drivers, such as graphics, Wi-Fi, or USB drivers. For best performance, update all adapter drivers and keep them up-to-date.
This solution is suggested for advanced users.
Interrupt moderation allows the adapter to moderate its interrupts. When a packet arrives, the adapter generates an interrupt, which allows the driver to handle the packet. At greater link speeds, more interrupts are generated, increasing the demand on the CPU. Too many interrupts result in poor system performance. When you enable interrupt moderation, the interrupt rate is lower and system performance improves.
Interrupt Moderation Rate sets the rate at which the controller moderates or delays the generation of interrupts, optimizing network throughput and CPU use. The Adaptive setting adjusts the interrupt rate dynamically, depending on traffic type and network usage. Choosing a different setting may improve network and system performance in certain configurations. A higher moderation rate suppresses more interrupts. The less often interrupts occur, the lower the CPU load; however, the host is then slower to send ACK responses, and so responds to requests and accepts image packets more slowly. Adjust the moderation rate based on the system, camera, and data rate of your current configuration.
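To see why moderation matters, consider the raw packet arrival rate on a saturated link, which is the interrupt rate with no moderation at all. The figures below are idealized (no protocol overhead or inter-frame gaps), but they illustrate the scale.

```python
# Packet arrival rate on a saturated link -- the unmoderated interrupt rate.
def packets_per_second(packet_size_bytes, link_bps=1e9):
    """Packets arriving per second when the link runs at full capacity."""
    return link_bps / (packet_size_bytes * 8)

print(round(packets_per_second(1500)))  # -> 83333 interrupts/s at 1500 bytes
print(round(packets_per_second(9000)))  # -> 13889 interrupts/s with jumbo packets
```

Even with jumbo packets, tens of thousands of interrupts per second can burden the CPU, which is why moderating (batching) them improves overall performance.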
This solution is suggested for advanced users.
Receive buffers are used by the adapter when copying image data to the system’s memory. Increasing the number of buffers can enhance receive performance but also uses more memory. Set the number of receive buffers based on your camera configuration.
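The memory trade-off can be estimated by multiplying the buffer count by the buffer size. The sketch below assumes each receive buffer holds one full-size jumbo packet; the actual buffer size is driver-dependent.

```python
# Estimate the kernel memory consumed by receive buffers (assumption:
# one full-size packet per buffer; actual size varies by driver).
def buffer_memory_bytes(num_buffers, packet_size=9000):
    return num_buffers * packet_size

# Raising the buffer count from a typical default of 256 to 2048:
print(buffer_memory_bytes(2048) / 1e6)  # -> 18.432 MB
```

A few tens of megabytes is usually acceptable on a modern system, so erring on the high side is reasonable when dropped packets persist.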