One eye on the future

December 12, 2013 | Read time: 7 mins
Traffic sign recognition is an integral part of Mobileye’s system
Mobileye’s Itay Gat discusses the evolution of monocular solutions for assisted and autonomous driving with Jason Barnes

Founded in 1999, Israeli company Mobileye manufactures and supplies advanced driver assistance systems (ADAS) based on its EyeQ family of systems-on-chip for image processing, supporting solutions such as lane sensing, traffic sign recognition, and vehicle and pedestrian detection. Its products are used by both the OEM and aftermarket sectors.

The company’s visual interpretation algorithms drive customer applications ranging from the provision of timely alerts of potential collisions and lane/road departures, to active braking for accident prevention and mitigation. From day one, says Itay Gat, its vice president of products, the company’s philosophy has been based on the use of monocular systems to provide affordable safety systems that will be installed in every car and save lives.

Mobileye’s monocular, vision-only approach is the natural solution, he says.

“Monocular vision can provide all the functionality that is required for safer driving. The theoretical proof can be seen in human vision’s ability to interpret the complex scene ahead when driving. There are other sensor types – radar, for example – that can provide some of that information, and in a very accurate manner. But none can provide the complete range required for modern systems.

“That’s not to say that our systems can’t be combined with other technologies – they can – and nor does it mean that we don’t see value in stereo vision – we do. But people need to consider the application of stereo. Most of us think, because we have two eyes, that we use stereo vision all the time, whereas most of the time our eyes operate as two independent monocular sensors. We might use stereo vision when doing something close-up and intricate, such as threading a needle, but we’re not using stereo vision when we view something at distance. It’s a very good idea to combine radar, laser and stereo but only as an add-on to basic monocular – remember, it’s perfectly possible to drive with only one eye.

“By the time we get to the stage of autonomous driving we’re going to need to exploit all the information we can. We at Mobileye are not refusing to add further sensors – but only once we have exploited fully the capabilities of the monocular vision solution.”

Global player

Mobileye is a Tier 2 supplier in the OEM market but, because of its major role in providing ADAS functionality together with its deep knowledge of other elements such as sensors and lenses, the company often works directly with OEMs, according to Gat.

In the OEM sector, the company’s customer list reads like a ‘Who’s who’ of the Tier 1 suppliers; Mobileye supplies most of the world’s major car manufacturers. It also provides aftermarket solutions which, Gat says, require a profound knowledge of the vision and processing systems as well as warning and mounting solutions. The company is also gaining significant traction in its domestic market through insurance companies.

By 2012, the company had shipped one million systems and expects to more than double that figure by the end of this year. However, that has to be set against an annual global figure for new vehicles sold of 60 million. Gat sees significant potential for vision-based systems to become standard fitment on an increasing proportion of new vehicles within a very short time, which he says will increase numbers exponentially: “From 2014, the only way that car manufacturers will be able to achieve a five-star Euro NCAP rating for their new products will be by incorporating active safety systems. That means the inclusion of autonomous emergency braking for the vehicle, something which will be extended to include pedestrians by 2016. Incorporation of active safety will shortly thereafter push down to a four-star rating, and the auto manufacturers are gearing up for that. It all means that in the not too distant future we’ll see a situation where 85-95% of those 60 million new vehicles sold each year will incorporate technologies of the type manufactured by companies such as Mobileye.”

Technological evolution

Mobileye is currently shipping its third-generation product and working on the follow-on. First-generation OEM and aftermarket systems were broadly analogous in performance terms and offered applications such as lane departure warning, high-beam lighting control and traffic sign recognition. Second-generation systems added vehicle detection applications such as headway monitoring and, most notably, forward collision warning.

“They were an eye-opener for many people because it had been assumed that radar was needed for many of the applications we now provide with vision,” Gat says. “The development and marketing of such active safety systems also opened the way for the development of testing procedures by organisations such as NHTSA in the US. It also helped to precipitate the moves by organisations such as Euro NCAP to encourage take-up of third-generation systems incorporating active autonomous braking, adaptive cruise control and so on.”

Third-generation systems, with their more active stance, introduce an element of separation in performance terms between OEM and aftermarket products but Gat feels that it should be possible to bring some of the OEM-standard functionalities into the retrofit systems. Specific domains – fleets, trucks and buses – are asking for this, he notes, but as OEM fitment becomes more prevalent there will be an inevitable tail-off in aftermarket demand.

The follow-on fourth generation will be the first to offer autonomous driving capabilities, and Gat says the technology is there to provide solutions within a year. Like others, he sees less demanding environments such as inter-urban driving as the routes to market, with more complex environments following later.

Vision system developments

When it comes to the evolution of the vision technology itself, Gat notes that while first-generation systems were sufficient in terms of the width of the video graphics arrays they used, they struggled to provide sufficient resolution in low-light conditions. Since then, new products have arrived with higher dynamic ranges and better signal-to-noise ratios. Work has also gone on to reduce pixel size and there has been a “clear, continuous process of improvement”. Systems on the market from next year will offer an improvement on the order of four to five times over those first-generation solutions in terms of dynamic range and extended gain, and by the end of the 2015/16 period further substantial improvement can be expected, Gat predicts.
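
To put that kind of gain in familiar sensor terms, dynamic range is usually quoted in decibels. The short Python sketch below simply converts the article’s four-to-five-times figure into dB, assuming the improvement is expressed as a linear intensity ratio; the conversion itself is standard, but the interpretation is an assumption, not a Mobileye specification.

    # Hedged illustration: express a linear dynamic-range improvement in decibels.
    # The 4x-5x figures come from the article; treating them as linear intensity
    # ratios is an assumption for illustration only.
    import math

    def ratio_to_db(ratio):
        """Convert a linear intensity ratio to decibels (20*log10 convention)."""
        return 20.0 * math.log10(ratio)

    for improvement in (4.0, 5.0):
        print(f"{improvement:.0f}x wider dynamic range ~= +{ratio_to_db(improvement):.1f} dB")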

Another trend is more related to aspect ratio and the need to be able to address a wide variety of applications and situations. This drives a need for higher resolutions and wider fields of view. Especially where fourth-generation solutions are concerned, Gat sees a range of three sensors being needed to span all of the possible applications: a system with a regular (45-52°) field of view; a second with a narrow (22-34°) field of view; and another which offers a very wide, fisheye-type field of view.
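
A rough way to see why no single camera covers every application is to compare angular resolution across those fields of view. The Python sketch below does this for a hypothetical imager 1280 pixels wide; the resolution and the fisheye angle are assumed figures for illustration, not Mobileye parameters – only the two quoted field-of-view ranges come from the article.

    # Hedged sketch: pixels per degree for the three camera types described above.
    # HORIZONTAL_PIXELS and the fisheye angle are assumptions for illustration.
    HORIZONTAL_PIXELS = 1280

    cameras = {
        "narrow (22-34 deg)": 28,    # approximate midpoint of the quoted range
        "regular (45-52 deg)": 48,   # approximate midpoint of the quoted range
        "fisheye (~150 deg)": 150,   # assumed angle; the article only says 'very wide'
    }

    for name, fov_deg in cameras.items():
        print(f"{name:20s}: {HORIZONTAL_PIXELS / fov_deg:5.1f} pixels per degree")

Under these assumed numbers the narrow camera puts roughly five times as many pixels on a distant target as the very wide one, which is the trade-off that pushes towards a set of sensors rather than a single imager.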

“Addressing all possible applications of a vision-based solution with just the one solution simply won’t be possible,” Gat continues. “It will be possible to have these solutions installed together and to an extent you could regard them as being ‘stereo’ solutions but they’re not really. The space needed for a windscreen-mounted solution is really quite small – only a 6-7cm space – but you would gain redundancy and a good field of view. Some might question whether that’s separation enough to allow the performance we’ll need but again I’ll refer to human eyes and how close together they are on the face.”
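
The separation question can be reasoned about with the standard pinhole stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline and d the disparity: a disparity error of Δd produces a depth error of roughly Z²·Δd/(f·B), so a small baseline mainly costs accuracy at long range. The Python sketch below illustrates that trend with hypothetical focal-length and disparity-noise values; only the 6-7cm baseline figure comes from the article.

    # Hedged sketch of stereo depth uncertainty versus baseline.
    # Pinhole model: Z = f * B / d, so an error of delta_d pixels in disparity
    # gives a depth error of roughly Z^2 * delta_d / (f * B).
    # FOCAL_PX and DISPARITY_NOISE are assumed values, not Mobileye parameters.
    FOCAL_PX = 1000.0       # assumed focal length in pixels
    DISPARITY_NOISE = 0.25  # assumed disparity measurement error in pixels

    def depth_error(z_m, baseline_m):
        """Approximate depth error in metres at range z_m for a given baseline."""
        return (z_m ** 2) * DISPARITY_NOISE / (FOCAL_PX * baseline_m)

    for baseline_m in (0.065, 0.30):      # ~6.5cm windscreen module vs a wider rig
        for z_m in (10.0, 50.0, 100.0):
            print(f"baseline {baseline_m * 100:4.1f}cm, range {z_m:5.1f}m: "
                  f"~{depth_error(z_m, baseline_m):6.2f}m depth error")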

Such solutions will need to offer zero false positives or false negatives. This is essential for an autonomous driving environment which uses vision as a primary sensor – it is unacceptable, for instance, to have a self-guided vehicle continually braking for no good reason. By the 2020/25 timeframe, Gat sees autonomous driving as not just possible but commonplace. But, he notes, in the years between now and then the market needs to be shown solutions and acceptance has to be gained.
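
A quick calculation shows why that bar is so high: at video frame rates, even a tiny per-frame error probability turns into frequent phantom braking events. The Python sketch below works out the per-frame false-positive budget for an assumed frame rate and an assumed acceptability target; both numbers are illustrative assumptions, not figures from the article.

    # Hedged sketch: per-frame false-positive budget for an autonomous-braking system.
    # FRAME_RATE_HZ and the target interval between false alarms are assumptions.
    FRAME_RATE_HZ = 36                  # assumed camera frame rate
    HOURS_BETWEEN_FALSE_ALARMS = 1000   # assumed acceptability target

    frames_per_target = FRAME_RATE_HZ * 3600 * HOURS_BETWEEN_FALSE_ALARMS
    max_per_frame_fp_rate = 1.0 / frames_per_target

    print(f"Frames per {HOURS_BETWEEN_FALSE_ALARMS} driving hours: {frames_per_target:,}")
    print(f"Allowed false-positive rate per frame: {max_per_frame_fp_rate:.1e}")

Under these assumptions the detector can afford roughly one mistaken frame in 130 million, which is the practical meaning of ‘zero false positives’.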
