The new Future Standards Forum will homogenise standards development in the machine vision and partnering sectors. Here, machine vision industry experts discuss developments. By Jason Barnes

At the Vision Show, which took place in Stuttgart at the beginning of November, the European Machine Vision Association (EMVA), the US’s Automated Imaging Association (AIA) and the Japan Industrial Imaging Association (JIIA) established a joint initiative, the Future Standards Forum (FSF). This, said the EMVA’s President Toni Ventura, will coordinate and homogenise standards development across the three regions.
Significantly, the FSF’s remit recognises that application specifics should play a large role in future systems’ development. That raises the possibility of camera manufacturers working in a closer, more structured fashion with ITS manufacturers and users. It also raises the issue of which standards are the most relevant to the transport sector. Interface standards such as Camera Link and CoaXPress are now considered mature, while relative newcomers such as GigE Vision and USB3 Vision are still evolving rapidly, as the industry experts quoted here explain.
Obligation and emphasis
Many, says Steve Hearn, see standards as an exercise in dumbing down, in that they force conformity and reduce differentiation. But this needn’t be the case, he continues.
“I can see that for the majority of people having a standard is useful. The risk comes from standards which are loosely defined and therefore open to interpretation. You end up with standards within standards. People also have to appreciate different standards’ levels. FireWire DCAM, GigE Vision, USB 3 Vision and so on are transport layer standards but these can feed into GenICam. In some respects I think many people would be better served by working to GenICam rather than the transport layer standards.”
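Hearn’s point about working to GenICam rather than to the individual transport layer standards can be made concrete with a short, self-contained Python sketch. The classes below are illustrative stand-ins rather than any real SDK: they model a GenICam-style node map of named features (the names follow the Standard Features Naming Convention), with the transport layer hidden behind the camera handle.

```python
# Illustrative sketch only: the classes below are stand-ins, not a real GenICam SDK.
# The point is that application code addresses standard feature names via a node map
# and never touches the transport layer (GigE Vision, USB3 Vision, FireWire DCAM).

class NodeMap:
    """Minimal GenICam-style node map: named features with get/set access."""
    def __init__(self, features):
        self._features = dict(features)

    def get(self, name):
        return self._features[name]

    def set(self, name, value):
        if name not in self._features:
            raise KeyError(f"Unknown feature: {name}")
        self._features[name] = value


class Camera:
    """Hypothetical camera handle; 'transport' could be GigE Vision, USB3 Vision, etc."""
    def __init__(self, transport):
        self.transport = transport
        self.node_map = NodeMap({
            "ExposureTime": 10000.0,   # microseconds (SFNC-style feature name)
            "TriggerMode": "Off",
            "TriggerSource": "Software",
        })


def configure_for_software_trigger(cam, exposure_us):
    """Identical application code regardless of the underlying transport standard."""
    cam.node_map.set("ExposureTime", exposure_us)
    cam.node_map.set("TriggerMode", "On")
    cam.node_map.set("TriggerSource", "Software")


if __name__ == "__main__":
    for transport in ("GigE Vision", "USB3 Vision"):
        cam = Camera(transport)
        configure_for_software_trigger(cam, exposure_us=5000.0)
        print(transport, "->", cam.node_map.get("TriggerMode"))
```

Because the application only ever addresses feature names through the node map, the same configuration code runs unchanged whichever transport layer standard sits underneath.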
A positive development is that, overall, the standards definition committees are getting much faster and more effective: “Everyone sees the benefits of shorter times to market. That’s necessary because the pace of development is accelerating. We didn’t have that many sensor manufacturers even just a few years ago but that’s changed with CMOS sensors having become worthwhile.”
Essential evolution
The absence of a single player big enough to force de facto standards on the machine vision industry makes cooperation all the more essential, says Joost van Kuijk. “Fewer standards would perhaps be better. We need worldwide standards and it’s useful that moves have been made to bring together the definition efforts in Europe, Asia and North America. I don’t see too much duplication in the standards we have at present – each has its own space as applications get more complex. There are common trends, though. Data rates are rising whilst trigger rates are getting faster. We now have all that we need to go from the apps at the lowest end up to 10GP/s, where in reality there are so few real applications that solutions can probably get away with being bespoke.”
Timing and platforms
Arlin Kalenchuk, Product Manager with Allied Vision, points to growing support for the IEEE 1588 Precision Time Protocol (PTP). “We’re seeing more and more suppliers offering this. It allows software triggering of multiple cameras over GigE Vision, where once a hardware trigger was needed. With PTP, all of a network’s cameras’ clocks can be synchronised and then, once synchronised, they can be triggered all at the same time. You can also synchronise the clocks to GPS.
“That means you can associate GPS time with an image and you can provide GPS time on a camera which doesn’t have a GPS clock installed. That’s a big potential cost saving.”
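A minimal sketch of the pattern Kalenchuk describes, using hypothetical class and method names rather than any vendor’s API: once the cameras’ clocks are PTP-synchronised to a common, GPS-disciplined timebase, one absolute timestamp can be sent to every camera as a software trigger, and each image can then be reported in GPS/UTC time even though no camera carries its own GPS receiver.

```python
# Hedged sketch: PtpCamera is a stand-in, not a vendor API. It models a camera whose
# clock is already PTP-synchronised to a GPS-disciplined grandmaster, so every camera
# shares one timebase expressed as nanoseconds since the Unix epoch.
import time

NS = 1_000_000_000


class PtpCamera:
    def __init__(self, name, clock_offset_ns=0):
        self.name = name
        # Residual offset after PTP sync (ideally sub-microsecond).
        self.clock_offset_ns = clock_offset_ns

    def now_ns(self):
        return time.time_ns() + self.clock_offset_ns

    def schedule_capture(self, trigger_time_ns):
        """Arm a capture at an absolute timestamp on the shared PTP timebase."""
        self._armed_at = trigger_time_ns

    def read_image(self):
        # A real camera would return pixel data; here we return only the metadata
        # that matters for the example: the shared timestamp of the exposure.
        return {"camera": self.name, "timestamp_ns": self._armed_at}


def trigger_all(cameras, delay_s=0.5):
    """Pick one absolute trigger time slightly in the future and send it to every camera."""
    trigger_ns = time.time_ns() + int(delay_s * NS)
    for cam in cameras:
        cam.schedule_capture(trigger_ns)
    return trigger_ns


if __name__ == "__main__":
    cams = [PtpCamera("lane_1"), PtpCamera("lane_2"), PtpCamera("lane_3")]
    trigger_all(cams)
    for cam in cams:
        img = cam.read_image()
        # Because the timebase is GPS-disciplined, this timestamp can be reported as
        # GPS/UTC time without fitting a GPS receiver to each camera.
        print(img["camera"], "exposed at", img["timestamp_ns"], "ns (common timebase)")
```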
Another healthy development is that camera manufacturers are also beginning to support operating systems other than Windows.
“Linux is becoming better supported as a platform for software. At present, Allied Vision is the only manufacturer offering a driver for QNX, a real-time operating system which allows you to be very deterministic when it comes to software triggering – a nice competitive advantage for us which links back to real-world timing and the ability to support time-critical applications.”
Reinforcing choice
A greater emphasis on standards definition is helping to reinforce the idea that, far from being an expensive solution, machine vision when viewed in sum is often the most cost-effective transport management choice, says Stéphane Clauss, Business Development Manager, Image Sensing Solutions. “In the short term at least, no one standard is going to cover all needs. USB3 Vision has a lot to recommend it, both in terms of the standard itself and the bandwidths it offers, but cable lengths are an issue. If you need longer cable lengths, then either GigE Vision or even analogue is the solution. However, if you want all of the functionality in terms of processing embedded at the camera head, we still believe in Camera Link; it allows greater optimisation and system designs which don’t need additional software on the PC side. Higher levels of at-the-edge processing increase efficiencies and reduce software development times. All of that better enables systems suppliers to customise their offerings and so is especially useful in the transport management sector.”
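Clauss’s rule of thumb – bandwidth and cost favour USB3 Vision, cable length favours GigE Vision or analogue, and camera-head processing favours Camera Link – can be sketched as a simple selection function. The thresholds below are illustrative assumptions, not figures taken from the article or from the standards themselves.

```python
# Rough decision sketch of the rule of thumb Clauss outlines; thresholds are
# illustrative assumptions, not values from the article or any standard.

def suggest_interface(cable_length_m, needs_camera_side_processing=False):
    """Return a plausible interface choice for a transport-management installation."""
    if needs_camera_side_processing:
        return "Camera Link (optimised, frame-grabber-based design)"
    if cable_length_m <= 5:          # assumed practical limit for passive USB3 cabling
        return "USB3 Vision"
    if cable_length_m <= 100:        # standard copper Ethernet segment length
        return "GigE Vision"
    return "GigE Vision over fibre, or an analogue link"


for length in (3, 30, 150):
    print(f"{length:>3} m run -> {suggest_interface(length)}")
```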
Live and let die
Although Clauss expresses continued support for Camera Link, the view from Teledyne DALSA is less rosy. “I don’t see FireWire being around for too much longer. I think Camera Link’s got another decade at best,” states the company’s representative.
Older standards such as Camera Link require a frame grabber – a dedicated interface card which receives the image data and reconstructs each frame in the PC. Newer standards such as GigE Vision and USB3 Vision don’t, as the image is assembled in the camera and delivered over a standard interface, something which is going to be a big factor going forward, he feels: “GigE Vision competes and will stay. It’s effective, works over long cable distances, has a standard hardware interface in the form of Ethernet and offers a natural migration path to 10 Gigabit Ethernet capacity. USB3 Vision will probably hurt Camera Link the most as it plays in the same bandwidth range. Systems can be had now for less than $1,000 apiece.
“In truth, many of the other emerging standards aren’t too applicable to traffic applications right now; especially where the primary driver is demand for extremely large bandwidth,” he continues. “Principally, these are for applications such as flat screen TV manufacture, where Teledyne DALSA supplies the imaging components that inspect the majority of flat screens worldwide. The good news is that the performance capabilities will be there for transport applications when they need them.”
Lighting standards – a way to go
Although the standards definition effort is gaining pace in other areas of machine vision, lighting still has some way to go, according to Gardasoft. “At the moment no-one is buying from the datasheet, only from empirical testing. The scientific approach doesn’t work because there’s no theoretical model for the reflectivity of parts of the vehicle and the contrast requirements of the image processing.
“There’s perhaps a case for starting with a standard for how outputs are expressed. We used to express our capabilities in terms of a scientific value of light output power. However, many in the security sector were expressing capability in terms of electrical input power to the LEDs. We had a light which was far brighter, but comparing our output power to another’s input power grossly distorted the relative brightness of the products. I’m not saying that you won’t be able to buy an off-the-shelf product; I’m sure that will come in time, but for now people are far better served by looking at practical matters – for example, ‘Will this solution work in this location, with this licence plate’s format?’... ‘Do we need multiple images of the plate, car colour and vehicle occupancy?’”
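The distortion Gardasoft describes is easy to see with some back-of-envelope arithmetic. The wattages and the LED wall-plug efficiency below are assumed purely for illustration.

```python
# Illustrative arithmetic only; efficiencies and wattages are assumed, not measured data.
# The point: a light quoted by electrical input power can look "brighter" on paper than
# one quoted by optical output power, even when it actually emits less light.

def optical_output_w(electrical_input_w, wall_plug_efficiency):
    """Radiant output = electrical input x LED wall-plug efficiency."""
    return electrical_input_w * wall_plug_efficiency


# Vendor A quotes optical output directly: 12 W of emitted light.
vendor_a_output_w = 12.0

# Vendor B quotes only electrical input: 40 W into the LEDs at an assumed 25% efficiency.
vendor_b_output_w = optical_output_w(electrical_input_w=40.0, wall_plug_efficiency=0.25)

print(f"Vendor A optical output: {vendor_a_output_w:.1f} W")
print(f"Vendor B optical output: {vendor_b_output_w:.1f} W (despite the bigger '40 W' headline)")
```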
Brightness is always an issue, he notes, although with greater take-up of machine vision technology in regions such as the Middle East, the ability to work reliably across a much greater temperature range is increasingly important.
“Lots of people come to us asking to use cheaper cameras. For that, they need DC lighting solutions, which typically have 20 per cent of the brightness of a strobe light with the same LEDs. They fail to realise that for just a small cost increase they can have a camera which triggers or synchs with a strobe, allowing a much higher performance strobe light to be used.
“Lighting isn’t just a product of how systems such as ours work. In many instances you need to strobe; there are health and safety limits which prevent a high-brightness IR light source being on constantly, for instance. And you can’t flash randomly with white light – it’s startling to drivers so you need to adjust strobing to provide bright, white light only when needed.”
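The earlier point about DC versus strobed lighting can be illustrated with equally rough arithmetic: pulsing the same LEDs briefly allows them to be overdriven well beyond their continuous rating, so the brightness seen during a short exposure is several times higher. The overdrive factor below is an assumption, chosen only to reproduce the 20 per cent rule of thumb quoted above.

```python
# Illustrative figures only (assumed, not from the article): LEDs run continuously at
# their rated current can typically be overdriven to several times that current when
# pulsed briefly, so peak brightness during a short camera exposure is much higher.

def relative_brightness(drive_current_a, nominal_current_a=1.0):
    """Crude model: brightness roughly proportional to drive current near nominal."""
    return drive_current_a / nominal_current_a


dc_brightness = relative_brightness(1.0)       # continuous drive at rated current
strobe_brightness = relative_brightness(5.0)   # assumed 5x overdrive for a ~100 us pulse

print(f"DC brightness relative to strobe: {dc_brightness / strobe_brightness:.0%}")
# -> 20%, matching the rule of thumb quoted above, under these assumed figures.
```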
Gardasoft, he says, is currently working on a new white light source which will be suitable for vehicle identification and colour, and may also prove to be useful for driver identification.
At the Vision show in Stuttgart in early November, Stemmer Imaging’s Director – Corporate Market Development, Mark Williamson, gave a 30-minute presentation on the interrelation of machine vision standards, how they interact and how to prepare for them. This presentation can be seen here: