Wednesday, March 04, 2015

Mobileye Unveils its 4th Gen Vision Processor

PRNewswire: Mobileye introduces its 4th generation system-on-chip, the EyeQ4, consisting of 14 computing cores, 10 of which are specialized vector accelerators for visual processing and understanding. The first design win for the EyeQ4 has been secured with a global premium European car manufacturer, with production to start in early 2018. The EyeQ4 will be part of a scalable camera system, starting from monocular processing for collision avoidance applications in compliance with EU NCAP, US NHTSA and other regulatory requirements, up to a trifocal camera configuration supporting high-end customer functions including semi-autonomous driving. In the high-end customer functions, the EyeQ4 will support fusion with radars and scanning-beam lasers.

"Supporting a camera centric approach for autonomous driving is essential as the camera provides the richest source of information at the lowest cost package. To reach affordable high-end functionality for autonomous driving requires a computing infrastructure capable of processing many cameras simultaneously while extracting from each camera high-level meaning such as location of multiple types of objects, lanes and drivable path information," said Amnon Shashua, cofounder, CTO and Chairman of Mobileye. "The EyeQ4 continues a legacy that began in 2004 with EyeQ1 where we leveraged deep understanding of computer vision processing to come up with highly optimized architectures to support extremely intensive computations at automotive compliant power consumption of 2-3 Watts."

The EyeQ4 provides "super-computer" capabilities of more than 2.5 teraflops within a low-power (approximately 3W) automotive-grade system-on-chip.
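As a rough back-of-the-envelope illustration (only the ~2.5 teraflops and ~3W figures come from the announcement; the snippet below is just a sketch), the implied compute efficiency works out to roughly 0.8 TFLOPS per watt:

# Rough efficiency estimate from the announced EyeQ4 figures
peak_tflops = 2.5        # claimed peak throughput, teraflops
power_w = 3.0            # approximate power consumption, watts

tflops_per_watt = peak_tflops / power_w
print(f"~{tflops_per_watt:.2f} TFLOPS/W")   # prints ~0.83 TFLOPS/W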

EyeQ4-based ADAS uses computer vision algorithms such as Deep Layered Networks and Graphical Models while processing information from 8 cameras simultaneously at 36fps. The EyeQ4 will accept multiple camera inputs from a trifocal front-sensing camera configuration, a surround-view system of four wide-field-of-view cameras, and a long-range rear-facing camera, plus information from multiple radars and scanning-beam laser scanners. Taken together, the EyeQ4 will process a safety "cocoon" around the vehicle – essential for autonomous driving.
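To put the 8-camera, 36fps figure in perspective, here is a minimal sketch of the implied per-frame compute budget, assuming (the release does not say this) that the full 2.5 teraflops is available to the vision pipeline:

# Rough per-frame compute budget, assuming all 2.5 TFLOPS is usable for vision
peak_flops = 2.5e12      # claimed peak throughput, FLOPS
cameras = 8              # simultaneous camera streams
fps = 36                 # frames per second per camera

flops_per_frame = peak_flops / (cameras * fps)
print(f"~{flops_per_frame / 1e9:.1f} GFLOP per camera frame")   # prints ~8.7 GFLOP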

Engineering samples of the EyeQ4 are expected to be available by Q4 2015. First test hardware with the full suite of applications, including an active-safety suite of customer functions, environmental modeling (for each of the 8 cameras), path planning for hands-free driving, and fusion with other sensors, is expected to be available in Q2 2016.


Thanks to MM for the link!

2 comments:

  1. Looks great.
    Do you know which foundry is making this large SoC?

    Cheers, Robert.

