Samsung today rolled out its new Exynos Auto V9 SoC, announcing that the automotive-focused chip will power Audi’s in-vehicle infotainment (IVI) system set to debut by 2021. Samsung’s first SoC for automobiles complies with Automotive Safety Integrity Level (ASIL)-B requirements, integrates Arm’s latest CPU and GPU technology, and supports multiple screens and cameras.

The Samsung Exynos Auto V9 packs eight Arm Cortex-A76 cores running at 2.1 GHz, though the company does not disclose whether it uses the Cortex-A76AE, which is designed specifically for automotive applications, or the regular Cortex-A76. On the graphics side of things, the SoC integrates three dedicated Arm Mali-G76 GPUs (i.e., GPUs that can work completely independently) to simultaneously drive the cluster display, central information display (CID), and rear-seat entertainment (RSE) displays. In addition, the processor features a (presumably custom) neural network processing unit (NPU) to process visual and audio data for face, speech, and gesture recognition. The Exynos Auto V9 also includes four HiFi 4 DSPs for audio as well as a safety island core that protects system operations in real time and supports ASIL-B standards. The chip can work with current-gen LPDDR4 as well as upcoming LPDDR5 memory.

Samsung’s first SoC for in-vehicle infotainment applications supports up to six displays and 12 camera connections, which should be enough for advanced autopilot capabilities (just to put it into context: Tesla's AutoPilot 2.0 only uses one camera for autopilot right now), though since the latter will be implemented by Audi, Samsung does not comment on the matter.

Samsung will use its 8LPP process technology to manufacture the chip. Meanwhile, Audi will use the Exynos Auto V9 for its IVI system that is set to debut by 2021. In general, expect the SoC to power vehicles that will arrive in 2020 and later.

At present Samsung already supplies Audi with its OLED displays, so the new agreement is a natural extension of the partnership between the two companies.

Source: Samsung

  • Robert Pankiw - Thursday, January 3, 2019 - link

    "[J]ust to put it into context: Tesla's AutoPilot 2.0 only uses one camera for autopilot right now"
    Included in the above link are screenshots from Tesla's configuration tool. The original Autopilot had 1 camera, whereas Autopilot 2.0 (or Enhanced Autopilot) has 4 cameras in addition to ultrasound. Teslas can go up to 8 cameras (which are enabled by a software update).
    Currently Autopilot 2.1 / 2.5 appears to simply be more compute power, and not more sensors.
  • Robert Pankiw - Thursday, January 3, 2019 - link

    Spam filter prevented my link from being included.

    It was electrek co/2016/10/19/ tesla-fully-autonomous-self-driving-car/
  • name99 - Thursday, January 3, 2019 - link

    Yeah, I've no idea what that "one camera on Tesla" claim is supposed to represent or in what manner it is "technically" accurate.

    Here's what Tesla actually says:
    "Eight surround cameras provide 360 degrees of visibility around the car at up to 250 meters of range." (plus 12 ultrasonic sensors and forward radar)

    https://www.tesla.com/autopilot
  • firewolfsm - Thursday, January 3, 2019 - link

    Correction w/r/t Tesla: They do in fact use all the cameras at this point, not just one.
  • mode_13h - Thursday, January 3, 2019 - link

    Man, 8-core is so passe. These days, the automotive sector is all about turbo-boosting smaller 6-core engines... er, I mean chips.
  • mode_13h - Thursday, January 3, 2019 - link

    Anyway, with 8 cores and 3 GPUs, it'd better have its own radiator. Especially when the kids start cryptomining on the RSE GPU.
  • danwat1234 - Tuesday, January 8, 2019 - link

    "(just to put it into context: Tesla's AutoPilot 2.0 only uses one camera for autopilot right now)" .. No, Tesla uses 8 cameras for Enhanced Autopilot.
