🚀 Open Source Hypervisor L4Re - Empowering the Future of Automotive at NXP Tech Days

It will only be a few more days until our CTO Adam Lackorzynski, together with Senior Software Engineers Jan Klötzke and Marcus Haehnel, will be attending the NXP Tech Days in Detroit on October 30th and 31st! They will present a powerful demonstrator featuring our open-source L4Re Micro Hypervisor running on the NXP S32Z with multiple VMs – among them Autoware running inside a Zephyr VM and controlling the steering and acceleration of an autonomous driving car.

The demonstration integrates a blueprint from SOAFEE. The Scalable Open Architecture for Embedded Edge is building a standardized architecture for Software Defined Vehicles (SDVs), in which vehicle functions are predominantly implemented in software. SOAFEE’s cloud-native platform approach enables development in the cloud and deployment on edge devices such as a vehicle’s high-performance computing systems.

Kernkonzept, as a SOAFEE member and Arm partner, showcases its L4Re Hypervisor framework together with the SOAFEE reference platform EWAOL, demonstrating how virtual machines can be seamlessly operated across different platform software stacks. This allows flexible system designs for various applications, including navigation and safety-critical functions. The SOAFEE architecture enables safety-critical and non-safety-critical applications to run securely and efficiently on a single platform, meeting real-time requirements and adhering to safety standards such as ISO 26262.

Adam Lackorzynski, who heads the SOAFEE Microkernel Working Group and also participates in the Hypervisor WG, will explain how the consolidation of Arm’s Cortex-A and Cortex-R CPUs allows partners to flexibly design systems for various automotive use cases in software-defined vehicles.

The SOAFEE Blueprints are reference setups that focus on specific use cases for Software Defined Vehicles. These blueprints validate the concepts and principles of the SOAFEE architecture in real-world scenarios. They provide examples of how applications can be developed in the cloud and deployed at the edge, such as in vehicles, without requiring the full DevOps cycle.

🔥 We can’t wait to be part of this event. Will you be in Detroit too? Do stop by our booth for live demonstrations and discussions on the future of open-source in automotive! #nxptechdays2024
Kernkonzept GmbH’s Post
-
🚗💻 AUTOSAR vs QNX: Shaping the Future of Automotive Software Architecture

As we race towards software-defined vehicles, the choice of underlying architecture becomes crucial. Let's explore how AUTOSAR and QNX are driving this transformation:

🔧 AUTOSAR: Standardization & Flexibility
• Interoperability: Seamless integration across suppliers
• Scalability: Adaptive AUTOSAR for high-performance computing
• Industry-wide adoption: Backed by major OEMs

🛡️ QNX: Real-time Performance & Security
• Deterministic performance: Critical for ADAS & autonomous driving
• Robust security: Built-in features for connected vehicles
• Proven track record: Widely deployed in critical systems

🤝 The Convergence
We're seeing a trend towards coexistence:
1. AUTOSAR: Standardized application interfaces
2. QNX: Underlying RTOS for safety-critical functions

This synergy accelerates innovation in:
• Functional safety (ISO 26262 compliance)
• Over-the-air (OTA) updates
• Advanced HMI and digital cockpits

🔮 Looking Ahead
The future may bring:
• Increased AUTOSAR-QNX integration
• AUTOSAR evolution for emerging tech (AI/ML)
• Greater focus on cybersecurity standards

The question isn't "AUTOSAR or QNX?" but rather how to leverage their combined strengths for safer, more connected, and autonomous vehicles.

💬 What's your take? How do you see AUTOSAR and QNX shaping tomorrow's vehicles? Share your thoughts below!

#AutomotiveTechnology #SoftwareDefinedVehicles #AUTOSAR #QNX #ConnectedCars
-
𝐔𝐧𝐥𝐞𝐚𝐬𝐡𝐢𝐧𝐠 𝐭𝐡𝐞 𝐏𝐨𝐰𝐞𝐫 𝐨𝐟 𝐀𝐝𝐚𝐩𝐭𝐢𝐯𝐞 𝐀𝐔𝐓𝐎𝐒𝐀𝐑: 𝐀 𝐆𝐚𝐦𝐞 𝐂𝐡𝐚𝐧𝐠𝐞𝐫 𝐢𝐧 𝐌𝐨𝐝𝐞𝐫𝐧 𝐕𝐞𝐡𝐢𝐜𝐥𝐞 𝐀𝐫𝐜𝐡𝐢𝐭𝐞𝐜𝐭𝐮𝐫𝐞 🚘

As an experienced developer deeply involved with Adaptive AUTOSAR, I am thrilled to share why this technology revolutionizes the automotive industry. Adaptive AUTOSAR stands out as a flexible, modular, and scalable architecture specifically designed for modern vehicles equipped with High-Performance Controllers (HPC). At its core, Adaptive AUTOSAR utilizes a Service Oriented Architecture (SOA) and the power of C++14, ensuring robustness and efficiency.

𝐖𝐡𝐲 𝐀𝐝𝐚𝐩𝐭𝐢𝐯𝐞 𝐀𝐔𝐓𝐎𝐒𝐀𝐑 𝐢𝐬 𝐓𝐫𝐚𝐧𝐬𝐟𝐨𝐫𝐦𝐚𝐭𝐢𝐯𝐞

𝐌𝐞𝐞𝐭𝐢𝐧𝐠 𝐄𝐯𝐨𝐥𝐯𝐢𝐧𝐠 𝐈𝐧𝐝𝐮𝐬𝐭𝐫𝐲 𝐍𝐞𝐞𝐝𝐬 🏭
The automotive industry is rapidly evolving, with increasing emphasis on electrification, advanced driver assistance systems (ADAS), autonomous driving, Vehicle-to-Everything (V2X) communication, and Over-the-Air (OTA) updates. Adaptive AUTOSAR plays a pivotal role in addressing these needs by providing a solid foundation for developing the complex functionalities required in modern vehicles.

𝐒𝐭𝐫𝐞𝐚𝐦𝐥𝐢𝐧𝐢𝐧𝐠 𝐃𝐞𝐯𝐞𝐥𝐨𝐩𝐦𝐞𝐧𝐭 🧠
One of the most significant advantages of Adaptive AUTOSAR is that it allows developers to focus on creating innovative applications rather than being bogged down by concerns over stack architecture, robustness, security, and maintenance. The architecture simplifies compliance with certification and governmental standards, making the development process more efficient and less stressful.

𝐅𝐚𝐜𝐢𝐥𝐢𝐭𝐚𝐭𝐢𝐧𝐠 𝐒𝐞𝐧𝐬𝐨𝐫 𝐅𝐮𝐬𝐢𝐨𝐧 📸
In the realm of modern vehicle development, sensor fusion is crucial. Adaptive AUTOSAR makes it remarkably easy to integrate data from various sensors like radar, cameras, and lidar. The platform itself offers demo applications that showcase how to build sensor fusion applications, making it an invaluable resource for developers.

𝐓𝐡𝐞 𝐅𝐮𝐭𝐮𝐫𝐞 𝐢𝐬 𝐒𝐨𝐟𝐭𝐰𝐚𝐫𝐞-𝐃𝐞𝐟𝐢𝐧𝐞𝐝 ⏳
As software continues to be the cornerstone of modern vehicles, Adaptive AUTOSAR emerges as a strong foundation for future vehicle architectures. Its adaptability and scalability ensure that it can meet the demands of today and tomorrow. By leveraging Adaptive AUTOSAR, we are not just keeping up with the industry’s evolution, we are driving it.

𝐉𝐨𝐢𝐧 𝐭𝐡𝐞 𝐂𝐨𝐧𝐯𝐞𝐫𝐬𝐚𝐭𝐢𝐨𝐧 📝
Adaptive AUTOSAR is more than just a technology, it's a movement. I invite you to share your thoughts and experiences with Adaptive AUTOSAR in the comments below. Let’s shape the future of automotive technology together!

#SoftwareDefinedVehicles #AdaptiveAUTOSAR #AutomotiveSoftware #EmbeddedSystems #SoftwareDevelopment #AutomotiveTech #development #adaptiveautosar #classicautosar #autosar #AUTOSAR #AUTOSARenthusiasts #Adaptive #Rust #Safety #Standardization #C
-
Alibaba and Nvidia are collaborating to advance autonomous driving technologies in China through an AI initiative that integrates Alibaba's large language models with Nvidia's automotive platforms.

Summary:
~ Alibaba Cloud, a major player within the Alibaba Group, has partnered with Nvidia to develop AI-powered solutions for the automotive industry.
~ The collaboration led to the unveiling of a large multimodal model (LMM) solution at the Apsara Conference.
~ This LMM solution is co-developed with Nvidia and Banma Network Technology, focusing on improving the autonomous driving experience.
~ Alibaba Cloud's Qwen portfolio of AI models is integrated with Nvidia's Drive AGX Orin platform for autonomous vehicles.
~ The integration intends to enhance the capabilities of Chinese EV manufacturers like Li Auto, Great Wall Motors, Geely Auto, and Xiaomi.
~ Nvidia's Drive AGX Orin platform is a key component, improving computational efficiency and reducing latency for real-time processing in automated driving systems.
~ The new AI solution empowers in-car voice assistants, allowing dynamic conversations and proactive recommendations.
~ It supports tasks like executing voice commands for various services, such as ordering through food apps, providing a richer user experience.

Source: https://lnkd.in/gs7XVTBN
-
The Motley Fool - "Why Are Nvidia and Uber Backing This Tiny $400 Million Artificial Intelligence (AI) Company?"

Serve Robotics is an approx. $579M Nasdaq-listed #microcap stock. Ali Kashani (https://lnkd.in/geF7keRz), Canada-educated and living in Los Altos, CA, is the co-founder and CEO of the Uber- and Nvidia-backed microcap. Mark Tompkins of Montrose Capital Partners (https://lnkd.in/g8sU6Q8E) is another big investor.

"Uber is partnered with 14 different companies developing autonomous driving platforms as it looks forward to a shift away from human drivers in the mobility industry. Technologies like autonomous driving and robotics are powered by AI, and Nvidia has also developed its own autonomous driving platform... In a presentation last month, Serve Robotics posed this thoughtful question: Why move two-pound burritos in two-ton cars? ... Since early 2022, the company's robots have delivered more than 50,000 orders on behalf of more than 400 restaurants across Los Angeles, and those deliveries were made with up to 99.94% reliability. That made the robots 10 times more reliable than human drivers, according to Serve."

HERE'S THE CATCH: The company is IN THE RED and only has approx. $50M cash on hand. If it raises an additional $100M by selling more shares, it will dilute Uber and Nvidia. So there are a number of warning signs if Uber and Nvidia are not willing to gamble on Serve any more.

SERVE (Nasdaq: SERV) is down 47.68 percent YTD, after its IPO immediately plummeted. In the past 6 months, though, SERV has risen 531.88 percent (66.62 percent in the past month).

Per Investing.com: "Serve Robotics director Jordan James Buckly sells $454,581 in stock." Buckly has sold multiple times over the past few weeks. "Serve Robotics president Touraj Parang sells $669,786 in stock." "Serve Robotics CEO Ali Kashani sells $278,776 in stock" (reported 2 days ago).

Considering that SERVE just IPO'd in 2024 and all of the bigshots are freaking out over President Trump and China trade uncertainties, this warrants a much deeper look into Serve, Uber and its many subsidiaries, and even Nvidia. #AI #Robotics #CantIgnoreThePumpNDump

LinkedIn News - "Nvidia has expanded its China workforce by nearly a third this year, Bloomberg reports, citing anonymous sources — part of an intensified focus on autonomous driving technologies. The California-based chipmaker added about 200 people in Beijing specifically to work on self-driving, plus others in after-sales service and networking software development. Though Nvidia faces U.S. trade restrictions in China — as well as a new antitrust investigation from Chinese authorities — the country is still a crucial one, bringing in over $5 billion in sales last quarter."

https://lnkd.in/gzYPaJ_C https://lnkd.in/gVJ2kSaH
Why Are Nvidia and Uber Backing This Tiny $400 Million Artificial Intelligence (AI) Company? | The Motley Fool
-
BOS Semiconductors Aims for Autonomous Driving with its AI Chips - First Automotive SoC (400 TOPS) to Launch in December 2024

July 2024

Last October, BOS entered into a co-development collaboration with Tenstorrent, a Canadian AI company led by CEO Jim Keller, a legendary figure in the semiconductor industry. Under the collaboration, Tenstorrent will supply its NPU (Neural Processing Unit) IP to BOS for integration into the N1 SoC (System-on-Chip). "In addition to Tenstorrent's software support, BOS's own NPU software team will provide additional software solutions for our customers' AI models, enabling customized AI NPU operations."

First Automotive NPU 'N1' to Launch in December, Mass Production in 2026, Supports LLM AI Models

In May 2024, BOS completed the tape-out process for the N1 automotive NPU accelerator using Samsung Electronics' 5nm process and plans to release the first samples in December. Tape-out signifies that the semiconductor design is complete and ready to be handed off to the foundry for fabrication. The N1 chip is set to receive ASIL-B (Automotive Safety Integrity Level B) certification and is expected to enter mass production in September 2026.

The N1 chip can support AI models, including LLMs (Large Language Models), in addition to autonomous driving functions. It boasts AI computation power of up to 400 TOPS (trillion operations per second) and a power efficiency of 12 TOPS/watt. It also supports #UCIe for #chiplets, PCIe (PCI Express) Gen 5 and CXL (Compute Express Link) interfaces, and utilizes LPDDR5/5X memory.

More on: https://lnkd.in/e6kqh5c5
BOS Semiconductors
-
Nvidia’s safety patent highlights that autonomous machines may need to be proactive, not reactive, to keep accidents from happening.

Patents like this may seek to show off the capabilities of Nvidia’s GPUs – whether for robotics or self-driving cars, said Rhonda Dibachi, CEO at Manufacturing-as-a-Service company HeyScottie. “They’re making money hand over fist doing what they're doing,” said Dibachi. “Nvidia’s saying, ‘I've got this shiny new GPU,’ and people look at that and ask, ‘What can you do with it?’ This patent is answering that question.” https://lnkd.in/gkcPzFwS

Plus, Wells Fargo’s recent patent wants to make sure you can trust the cloud: https://lnkd.in/g85FGKYx

And Adobe’s recent patent signals that it may be creating a marketing copilot: https://lnkd.in/gbKkVEVZ

For more insight on what’s next in tech, check out Patent Drop. https://lnkd.in/gCkSHGsA
Nvidia Robotics, Self-Driving Patents Could Make Its Chips Even More Popular
https://www.thedailyupside.com
-
My latest Medium article dives into Hydra-MDP, the groundbreaking approach by NVIDIA that won 1st place in the CVPR 2024 End-to-End Driving at Scale Challenge! 🏆

In summary, Hydra-MDP is a novel framework that combines machine learning and rule-based planning to create a robust and scalable autonomous driving system. It employs a multi-teacher knowledge distillation architecture, allowing the student model to learn from both human drivers and rule-based planners. Hydra-MDP also explicitly models safety-related driving scores, setting it apart from previous end-to-end driving systems.

Read more here: https://lnkd.in/dXAisTA5

NVIDIA DRIVE NVIDIA NVIDIA AI NVIDIA Robotics
#AutonomousDriving #CVPR2024 #HydraMDP #MachineLearning #SelfDrivingCars #NVIDIA
End-to-End Autonomous Driving with Hydra-MDP: A Breakthrough in Machine Learning-Based Planning by…
kargarisaac.medium.com
-
We're excited to announce that Untether AI has joined the Scalable Open Architecture for Embedded Edge (SOAFEE) Special Interest Group, focused on delivering an architecture and reference implementations for software-defined vehicles. As the leader in energy-centric AI acceleration, our expertise will help drive high-performance, low-latency AI solutions for autonomous driving. The centralized hardware platforms in software-defined vehicles create huge demand for efficient AI acceleration at the edge and cloud. By joining and collaborating with SOAFEE's 100+ members spanning automakers and suppliers, we look forward to providing our unique AI acceleration insights to help usher in this transformative era for the automotive industry. #SOAFEE #SoftwareDefinedVehicle #EnergyEfficientAI #ConnectedCars #AI #Edge #CloudComputing #AutonomousDriving #EdgeComputing
Untether AI Joins SOAFEE Special Interest Group to Drive Energy-Centric AI Acceleration Solutions into Software-Defined Vehicles
-
Software architecture in the automotive world refers to the high-level structure of software systems within vehicles. As modern vehicles become increasingly complex, with more electronic control units (ECUs), sensors, and interconnected systems, software architecture plays a critical role in ensuring that all components work together seamlessly.

Key Aspects:

1. **Modularity and Scalability:** Automotive software architecture is designed to be modular, allowing different components (like engine control, infotainment, or advanced driver-assistance systems) to be developed independently and then integrated. This modularity also supports scalability, enabling the same architecture to be used across different vehicle models with varying features.

2. **AUTOSAR (AUTomotive Open System ARchitecture):** A key standard in automotive software architecture, AUTOSAR provides a standardized platform that supports the development of vehicle software by ensuring compatibility and reusability of components across different systems and manufacturers.

3. **Real-Time Constraints:** Automotive software often has to meet stringent real-time requirements, especially in safety-critical systems like braking or steering, where delays could lead to accidents. The architecture must ensure that these systems respond quickly and reliably.

4. **Safety and Security:** Given the safety-critical nature of many automotive systems, the architecture must support rigorous testing and validation processes, adhering to standards like ISO 26262 for functional safety. With the rise of connected and autonomous vehicles, cybersecurity has also become a crucial consideration in software architecture.

5. **Integration of Emerging Technologies:** The architecture must be flexible enough to integrate emerging technologies, such as AI for autonomous driving, V2X communication (vehicle-to-everything), and over-the-air (OTA) updates, ensuring that vehicles remain up-to-date with the latest features and security patches.

In essence, software architecture in the automotive world is about creating a robust, flexible, and scalable framework that can support the diverse and evolving needs of modern vehicles, while ensuring safety, reliability, and security.

Please feel free to add your thoughts or continue to add some insights in the comment section.
-
Alibaba Cloud and Nvidia just announced a new collaboration to develop advanced AI solutions for autonomous driving, integrating Alibaba’s large language models with Nvidia’s automotive computing platform.

🔑 Alibaba's advanced Qwen AI models will be integrated into Nvidia's Drive AGX Orin platform, which is already used by major Chinese electric vehicle manufacturers.

🔑 The partnership aims to enhance in-car voice assistants with more dynamic conversations and intelligent recommendations based on visual and environmental data.

🔑 The companies are also working on adapting Alibaba’s AI models for Nvidia’s next-generation Drive Thor platform, combining advanced driver assistance, autonomous driving, and AI driver capabilities.

Two powerhouses in the AI space teaming up to tackle the hard problems of autonomous driving is a huge plus for the car industry, but Nvidia deciding to use Alibaba’s Qwen models is an even bigger, and unexpected, win for open-source. #ai #technology https://lnkd.in/gMNhkCXp
Alibaba, Nvidia collaborate on advanced autonomous-driving solution
scmp.com