Category: Electronics

  • What If Your Bike Could Understand Brospeak?


    Urtopia will show a new electric motor and a titanium-frame e-bike at CES 2025.

    Urtopia, the self-proclaimed leader in lightweight e-bike design, will showcase its next bike technology at CES 2025:

    1. Quark Motor. With a weight of 1,200 g, it is, Urtopia claims, the lightest and most potent e-bike motor ever.
    2. Titanium Zero, with a 3D-printed titanium frame.
    3. Urtopia GPTs: The world’s first voice assistant for cyclists.

    The Urtopia Titanium Zero e-bike weighs only 10 kg yet includes a solid-state battery with an energy density of up to 300 Wh/kg. It is compatible with SRAM’s 12-speed electric shifter and can be equipped with premium components, including carbon wheels, carbon forks, and a titanium alloy seat post.

    The Titanium Zero includes a ChatGPT-powered voice interface. With it, you can interact with your bike as if it were your riding buddy. In the example given, the rider asks where to go after the ride, and ChatGPT suggests a coffee shop.

    “Bro, don’t you have something more exciting?”

    This leads to a whiskey bar, but that is cut short by a reminder of an appointment and having to go home in “turbo mode.” It’s scary fast. The rider is not heard from again.

    Despite the obvious staging, it’s apparent that a natural-language interface would beat having to press buttons on a cycling computer while zipping down the road on two wheels.

    Explore more about Urtopia GPTs here.

  • Siemens Hangs on to Cash, Gets $10.5 Billion Loan to Finance Altair Acquisition


    Siemens AG has secured a $10.5 billion bridge loan to finance its acquisition of Altair Engineering Inc., marking its largest-ever software company acquisition. The loan was fully underwritten by Goldman Sachs and is slated for syndication among more than ten of Siemens’ partner banks.

    Oops. I’m a little short. Someone spot me a couple billion? Siemens had $9.16 billion in cash and cash equivalents but had agreed to pay $10.6 billion for Altair.

    The acquisition agreement, announced on October 30, 2024, and covered here, involves Siemens purchasing Altair at $113 per share, representing an 18.7% premium over Altair’s closing price on October 21. The acquisition price represents roughly 14 times Altair’s projected 2025 annual revenue.
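As a sanity check on those figures, the pre-announcement share price can be recovered from the offer and the premium. A back-of-the-envelope sketch using only the numbers quoted in the agreement:

```python
# Back-of-the-envelope check on the reported Altair offer terms.
offer_per_share = 113.00      # Siemens' offer, USD per share
premium = 0.187               # 18.7% premium over the Oct 21 close

# Implied closing price on October 21:
implied_close = offer_per_share / (1 + premium)
print(f"Implied Oct 21 close: ${implied_close:.2f}")   # ≈ $95.20
```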

    “Starting in fiscal 2025, we will take Siemens to the next level of value creation,” said Siemens CEO Roland Busch in the company’s Q4 financial report. “We will continue to invest in R&D and M&A to secure faster growth based on our technological strengths and ability to scale across industries. Our planned acquisition of Altair reinforces our leadership in industrial software and AI.”

    Siemens’ strategy is to integrate hardware and software solutions to bridge the real and digital worlds.

    Altair, a Detroit-area company known almost entirely for its simulation software, is big in the automotive sector but also has customers in other manufacturing industries, consumer goods, and energy.

    But Siemens getting a loan was a bit of a surprise to us. It was a change of plans: Siemens initially announced it had all the means to buy Altair without assistance.

    “The acquisition will be fully cash-financed from Siemens’ existing resources and its capacity to fully finance the transaction based on Siemens’ strong balance sheet,” reads the October 30 announcement of the acquisition.

    Keeping the money in the bank lets Siemens make more big acquisitions, which the company seems eager to do.

    “This was definitely not the last acquisition we make in the area of software,” said managing board member Cedrik Neike to German newspaper Handelsblatt in an interview published on Friday. “We have the financial strength to be able to do further deals.”

  • CPUs and GPUs. Why Not One Chip to Do It All?


    What do these guys know that we don’t? One chip can do everything. Now they have funding to develop a prototype.

    German hardware startup Ubitium has secured funding to develop what the company says is a “revolutionary universal processor.” Ubitium’s plan is to have one chip that does it all without having to have separate processors for different functions. The do-it-all chip will be built around RISC-V architecture.

    What is RISC-V?

    RISC-V is an open-standard instruction set architecture (ISA) based on reduced instruction set computer (RISC) principles. Unlike proprietary ISAs such as x86 (used by Intel and AMD) or ARM, RISC-V is free and open, meaning anyone can use, modify, and implement it without paying licensing fees. Successful implementers of RISC-V include NVIDIA, which uses RISC-V cores for embedded controllers in its GPUs, as well as SiFive, Western Digital, and Alibaba’s semiconductor division, T-Head.
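Because the ISA specification is open, the base instruction formats are simple enough to decode by hand. A minimal sketch, using the standard RV32I I-type layout and the canonical encoding of `addi x1, x0, 5`:

```python
# Decode an RV32I I-type instruction word (here: addi x1, x0, 5 = 0x00500093).
word = 0x00500093

opcode = word & 0x7F            # bits 6:0   -> 0x13 (OP-IMM group)
rd     = (word >> 7) & 0x1F     # bits 11:7  -> destination register
funct3 = (word >> 12) & 0x7     # bits 14:12 -> 0 means ADDI
rs1    = (word >> 15) & 0x1F    # bits 19:15 -> source register
imm    = word >> 20             # bits 31:20 -> 12-bit immediate
if imm & 0x800:                 # sign-extend if the top immediate bit is set
    imm -= 0x1000

print(f"opcode={opcode:#04x} rd=x{rd} funct3={funct3} rs1=x{rs1} imm={imm}")
# -> opcode=0x13 rd=x1 funct3=0 rs1=x0 imm=5
```

The fixed, published bit layout is exactly what lets anyone — NVIDIA, SiFive, or a startup like Ubitium — implement the ISA without a license.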

    The processor industry is built around boundaries placed on computing tasks, says Hyun Shin Cho, CEO of Ubitium. “We’re erasing those boundaries. Our Universal Processor does it all – CPU, GPU, DSP, FPGA – in one chip, one architecture. This isn’t an incremental improvement. It is a paradigm shift.”

    It does AI as well, according to the company’s press release.

    Previous attempts to combine CPU, DSP, GPU and FPGA functions have resulted in what is known as a System on a Chip (SoC). Apple designs and uses SoC architecture in its blazingly fast M-series chips, which devote part of the die to a neural engine. Other SoCs include Xilinx FPGAs, Intel’s Stratix 10, and AMD’s Versal ACAP.

    Who’s Behind This?

    Ubitium’s founder and CTO, Martin Vorbach, holds over 200 patents for semiconductor technology. He retains his CTO positions at PACT XPP Technologies and Hyperion-Core, both of which his LinkedIn profile lists him as having founded. Vorbach studied computer science at Germany’s Karlsruhe Institute of Technology (KIT), where he met Hyun Shin Cho.

    CEO Hyun Shin Cho has 20 years of experience in various industrial sectors. He studied engineering at KIT and Purdue.

    Chairman Peter Weber has worked at Intel, Texas Instruments, and Dialog Semiconductor.

    About Ubitium

    Ubitium has raised $3.7M in a seed funding round led by Runa Capital, Inflection, and KBC Focus Fund.

    The company has offices in Düsseldorf and Rülzheim in Germany.

  • The Robots Are Coming. This Time, It’s for Good.


    In a world enthralled with AI and in love with NVIDIA’s founder/CEO Jensen Huang for making the chips that make it happen, NVIDIA’s stock is gold.

    At the company’s annual developer conference, GTC 2024, held earlier this year, Huang made it quite clear: he was betting all his chips on humanoid robots.

    In an Acquired podcast, Huang is on top of the world yet humble enough to give credit to OpenAI for the AI revolution. He only takes credit for positioning NVIDIA near enough to AI technology so that when it bloomed, his company and his company alone would have the chips available for it.

    “You want to be near where the apple falls,” says Huang. “Even if I’m not sure what and when I position the company to be near the tree, we may have to do a diving catch for it. Even if you don’t catch the apple before it hits the ground, you want to be the first one to pick it up.”

    Eyes up here. Detecting driver expressions and whether drivers are nodding off is a use of NVIDIA’s OmniVision system. Image: NVIDIA.

    AI was not the first apple Huang picked up. Before AI, he was equally gung-ho about autonomous vehicles. Previous GTCs featured vehicles emblazoned with NVIDIA logos, their backs filled with GPU-laden computers. A massive TuSimple tractor cab was on the show floor, and their CEO was on stage. AVs needed enormous computing power on the spot. There was no time to bounce signals off remote servers. The lag caused by accessing far-off servers could kill.

    However, when an AV did kill, as it did on March 18, 2018, in Tempe, Arizona, the fallout caused Uber to abandon its plans for AVs. TuSimple laid off 75% of its US workforce and abandoned its plans to automate the truck industry. The auto industry and investor community lost interest. There was not to be a world full of people being driven around by their vehicles any time soon, nor the windfall NVIDIA may have expected.

    AI may not be as intelligent as we would like it to be, but since it hasn’t killed anyone yet, the AI fever continues unabated. Because of it, NVIDIA’s revenue, profits and valuation are through the roof.

    What’s Next?

    Where’s the next apple going to fall? At the company’s annual developer conference, GTC 2024, held earlier this year, Huang made it quite clear: humanoid robots. And once again, he had the hardware for it. NVIDIA’s hardware, and only NVIDIA’s hardware, would allow humanoid robots to finally hit their stride.

    Humanoid robots have been the stuff of sci-fi movies for a century. Fritz Lang’s 1927 Metropolis starred a humanoid robot. Since then, most movies have depicted robots as killers, culminating in The Terminator. Sex robots caused a brief sensation. Disney kept the dream alive with robots in their theme parks, but robots for home use were mainly novelty items. We had all resigned ourselves to evil robots in the movies and good robots on the assembly lines.

    Then came GTC 2024 and Jensen Huang on the main stage… and materializing from the darkness behind him an entire chorus line of robots. Were they about to dance and high kick like Dallas Cowboys cheerleaders? I prayed not. They were far too close to Huang. I could not imagine NVIDIA without Jensen.

    I was, in my moment of panic, missing the point Huang was making: Humanoid robots were going to be the next big thing. And luckily for us all, the humanoid robots remained safely stationary. The only robot action was with a harmless, knee-high robot on wheels that responded to Huang’s commands about as much as my pug dog.

    What Would Jensen Do?

    Since Jensen Huang has shown a Midas touch with AI, he easily commands our attention.

    NVIDIA’s interest in humanoid robots is driven by its vision for a future where robots can seamlessly operate in human environments. CEO Jensen Huang explains that since the world is structured around human needs—like factories designed for human workers—humanoid robots are well-suited to perform tasks in these spaces. By training robots with data based on human movements and behaviors, NVIDIA aims to create machines that can learn and adapt to complex tasks autonomously, ranging from assembly line work to complex navigation.

    This aligns with NVIDIA’s broader mission to advance embodied AI and autonomous robotics, areas they believe will not only support industrial productivity but also create jobs. Through projects like Project GR00T and platforms like NVIDIA Isaac, NVIDIA is building a comprehensive toolkit for developing and training humanoid robots. This approach allows these robots to better understand and perform human-like tasks, which Huang sees as foundational to the future of robotics and artificial intelligence. NVIDIA’s investments also capitalize on the potential of AI to handle global challenges, such as labor shortages and industrial efficiency, positioning the company as a leader in the robotics and AI sectors.

    Holoscan and Machine Vision

    NVIDIA Holoscan is a comprehensive AI sensor processing platform developed by NVIDIA that is tailored for the low-latency processing of sensor data. It includes optimized libraries for AI data processing and offers core microservices for managing streaming, imaging, and networking applications. Holoscan supports embedded devices, edge computing and cloud platforms.

    NVIDIA Jetson is an edge AI and robotics platform designed to provide high-performance computing for applications in robotics, autonomous machines and AI at the edge. It includes compact, powerful modules and a software development kit (NVIDIA JetPack).

    NVIDIA offers Jetson for products that need object recognition, autonomous navigation, and complex data processing. It integrates with NVIDIA’s Holoscan, Metropolis, and Isaac, which simplify development for applications in industrial automation, smart cities, healthcare and more.

    The “eyes” of humanoid robots are sensors and processors like OMNIVISION’s OG02B10 image sensor and OAX4000 processor. The sensor captures visual data, much like a human eye, enabling robots to see their surroundings in precise, high-resolution detail, even in motion. Meanwhile, the OAX4000 processor acts like the visual cortex, interpreting this data by enhancing image quality, managing multiple camera inputs, and processing it for the robot’s AI system to make decisions.

    The “brain” in a humanoid robot is typically the AI and processing unit, which interprets sensor data, makes decisions and drives the robot’s actions. In robots built on NVIDIA platforms, this is often handled by processors like the NVIDIA Jetson or more advanced AI processors such as NVIDIA’s Thor and Project GR00T. These processors are designed to run complex AI algorithms and machine learning models that allow the robot to “think” and respond in real time.

    NVIDIA Jetson and similar processors in humanoid robots are considered edge computers. Edge computing refers to processing data close to where it is generated (at the “edge” of the network), which reduces latency and enhances responsiveness. This approach is crucial for real-time applications like robotics, where immediate processing is needed for tasks like obstacle avoidance, facial recognition, and decision-making.

    For instance, NVIDIA Jetson modules are built to perform AI and deep learning computations directly on the device, without relying on remote cloud servers. This capability makes them ideal for powering the “brain” of a humanoid robot, enabling it to process sensor data from machine vision components quickly and act in real time.

    For example, NVIDIA Jetson is equipped with powerful GPUs and AI acceleration capabilities that enable the robot to process visual and environmental data from vision sensors. This information is then used by the AI models running on Jetson or another processor to interpret surroundings, plan movements, and interact with humans in a responsive manner.
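The latency argument is easy to make concrete. A rough sketch of how far a machine travels while waiting on a decision; the latency figures here are illustrative assumptions, not NVIDIA specifications:

```python
# Illustrative: distance covered while waiting for an inference result.
# Latency numbers below are assumptions for the sketch, not vendor figures.
speed_kmh = 60.0
speed_mps = speed_kmh * 1000 / 3600        # metres per second

edge_latency_s  = 0.020   # assumed on-device inference, 20 ms
cloud_latency_s = 0.220   # assumed cloud round trip plus inference, 220 ms

for label, latency in [("edge", edge_latency_s), ("cloud", cloud_latency_s)]:
    distance = speed_mps * latency
    print(f"{label}: {distance:.2f} m travelled before a decision")
```

At highway speeds, the difference between centimetres and metres of blind travel is the whole case for on-device processing.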

    The Market for Humanoid Robots

    The humanoid robot market is expected to experience rapid growth over the next decade, driven by advances in artificial intelligence, robotics, and increasing demand across industries such as healthcare, personal assistance, and retail. The global market, valued at around $1.5 billion in 2023, is projected to expand significantly, reaching up to $66 billion by 2032, with a compound annual growth rate (CAGR) ranging from 34% to 62%, depending on the market segment and region.
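Those projections imply a growth rate you can verify directly. A quick check using the $1.5 billion (2023) and $66 billion (2032) figures quoted above:

```python
# Verify the CAGR implied by the market figures quoted above.
start_value = 1.5           # USD billions, 2023
end_value   = 66.0          # USD billions, 2032 projection
years       = 2032 - 2023   # 9 years

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # ≈ 52.3%, inside the quoted 34-62% band
```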

    Aging populations and the need for automation in caregiving are major growth drivers, as humanoid robots are increasingly used for companionship, healthcare assistance, and rehabilitation. In industries like retail and hospitality, robots enhance customer service through tasks like guiding, providing product information, and handling transactions. In addition, regions like Asia-Pacific, led by countries such as Japan and South Korea, are at the forefront of adopting humanoid robots, driven by strong governmental support and robust tech infrastructure.

    However, challenges remain, such as high costs, security against hacking, and safety, especially as humanoid robots integrate with humans. But with mass production and technical advances, the humanoid robot market is bound to advance — as long as they don’t kill someone.

  • OMNIVISION Claims Smallest-Ever IR Sensor


    OMNIVISION has introduced the OV0TA1B, which it claims has the smallest footprint of any IR CMOS sensor, able to fit in the tight quarters found in ultrathin devices such as smartphones, tablets and laptops, some of which have a “3mm Y dimension” requirement.

    For comparison, the bezel of the latest iPhones is an impossibly thin 1.15 mm, but its expanded bezel, the “Dynamic Island,” provides a generous 7.7 mm of height.

    Not a lot of room in there. Rear-facing sensors in the Apple iPhone 14. Apple uses Sony’s CMOS sensors but is rumored to be switching to Samsung’s image sensor. The OV0TA1B may put OMNIVISION in contention. Image: Applemagazine.com.

    The small form factor of the OV0TA1B makes it ideal for human presence detection (HPD) and facial recognition on such devices. It has a 2-micron (µm) pixel and provides 440 x 360 resolution at 30 frames per second. The company claims its energy consumption is so low that the sensor can be kept on at all times.
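For scale, those specs translate into a modest data rate and a tiny active area, which is the point of the always-on claim. Arithmetic from the figures above; the raw bit depth is an assumption, not from the release:

```python
# Rough numbers derived from the OV0TA1B specs quoted above.
width, height = 440, 360          # resolution
fps = 30
pixel_pitch_um = 2.0              # 2-micron pixels
bits_per_pixel = 10               # assumed raw bit depth, not from the release

pixels_per_frame = width * height                 # 158,400
pixel_rate = pixels_per_frame * fps               # ~4.75 Mpixel/s
data_rate_mbps = pixel_rate * bits_per_pixel / 1e6

# Active array area from pixel pitch, in mm^2:
area_mm2 = (width * pixel_pitch_um / 1000) * (height * pixel_pitch_um / 1000)
print(f"{pixels_per_frame} px/frame, {data_rate_mbps:.1f} Mbit/s, "
      f"active area {area_mm2:.2f} mm^2")
```

An active array well under a square millimetre is what lets the part squeeze into a 3 mm bezel budget.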

    The OV0TA1B use cases offered by OMNIVISION:

    • User validation by facial recognition, which has become common in smartphones and tablets
    • Counting people with webcams, such as those found in surveillance systems. (OMNIVISION may or may not be used in surveillance systems.)

    OMNIVISION competes with several companies in the miniature infrared (IR) CMOS sensor market, including Sony, STMicroelectronics, ON Semiconductor, Samsung and Canon.

    These companies, along with OMNIVISION, are key players in the development and production of miniature IR CMOS sensors. Apple has primarily utilized Sony’s CMOS image sensors in its iPhones for over a decade. In December 2022, Apple CEO Tim Cook confirmed this longstanding partnership. However, recent reports indicate that Apple is considering incorporating Samsung’s image sensors into future iPhone models, potentially as early as 2026.

  • OMNIVISION Rolls Out Machine Vision System at VISION 2024


    OMNIVISION has announced that its camera system, including the OG02B10 global shutter sensor and OAX4000 image processor, has been blessed by NVIDIA and is available with NVIDIA’s Holoscan and Jetson platforms.

    Together, the OG02B10 and OAX4000 are the eyes and brain, respectively, of machines that need to see, such as autonomous vehicles or humanoid robots, and can be used for navigation, object detection, and interaction with humans. Having as much of the machine vision system embedded on the spot as possible is critically important to rapid response loops. You do not want either an autonomous vehicle or a robot to have to wait for instructions from a remote server in the cloud.

    The OG02B10 color global shutter (GS) image sensor, with its 3-micron × 3-micron pixel technology, delivers precise, high-quality images even in low light without producing blur or artifacts, according to the release.

    The OAX4000 is an image signal processor (ISP) that can manage images from up to 4 cameras with a 140 dB high dynamic range (HDR), according to OMNIVISION. It can handle multiple vision sensors, which is what gives humanoid robots a spatial sense, just as your stereoscopic vision gives you depth perception.
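That 140 dB figure is a ratio in disguise, and converting it shows why it matters for scenes containing both glare and shadow. A quick conversion, assuming the usual 20·log10 convention for sensor dynamic range:

```python
# Convert the OAX4000's quoted 140 dB HDR figure to a linear ratio,
# assuming the standard 20*log10 dynamic-range convention.
dr_db = 140.0
ratio = 10 ** (dr_db / 20)

print(f"140 dB ~ {ratio:,.0f}:1 brightest-to-darkest")   # 10,000,000:1
```

A 10,000,000:1 span is roughly what it takes to keep both direct sunlight and a dark underpass usable in the same frame.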

    NVIDIA Holoscan is a full-stack AI sensor processing platform that connects embedded systems to the edge and the cloud.

    NVIDIA Jetson is a platform for edge AI and robotics.

    OMNIVISION had the combined machine vision system on display at their booth at the VISION conference and exhibition, which was recently held at Messe Stuttgart in Germany.

  • Lucid to Use Everspin Technologies’ PERSYST MRAM for Gravity SUV


    Grand frunk central. Lucid Gravity’s frunk has a seating option. Image: Lucid Motors.

    Lucid Motors will be using Everspin Technologies’ PERSYST MRAM for its Gravity luxury all-electric SUV, which is expected to be delivered to eager customers in early 2025.

    The Gravity will have an undisclosed number of 256Kb MRAMs. In automotive applications, persistent memory like MRAM is typically deployed in key places rather than in large quantities like ordinary RAM. An EV might use a few MRAM chips for critical functions—one or two per key system, such as the powertrain control module, battery management system, or main control units.

    “The selection of our PERSYST product family is a testament to the reliability and performance our MRAM products provide in demanding environments,” said Everspin CEO Sanjeev Aggarwal. The win comes as MRAM’s role in automotive data resilience grows.

    MRAM stands for magnetoresistive random access memory, a type of non-volatile memory that uses magnetic states to store data rather than an electric charge. Non-volatile memory retains information even when the power is turned off.

    The magnetoresistive RAM concept originated in the 1970s with researchers at IBM and Motorola (later Freescale), but it was in the 1990s that the technology turned into a product. Freescale Semiconductor is credited with creating the first commercially viable MRAM in 2006. Everspin, a spinoff from Freescale, advanced the concept with its Spin-Transfer Torque MRAM (STT-MRAM), which emerged in the 2000s with an advantage in automotive applications, according to Everspin sources.

    Lucid may have picked Everspin’s MRAMs after its experience with using them in the design of the Lucid Air, which uses the MR25H256AMDF 256Kb chip integrated into its master powertrain system.

    MRAM chips claim several advantages over other types of non-volatile memory, such as Flash or FeRAM, particularly in applications requiring enduring memory retention, fast speeds, and reliable data integrity. Flash memory degrades with repeated writes, while MRAM has virtually unlimited write cycles. This should appeal to vehicle manufacturers, who do not want to service warranty claims, and to vehicle owners after the warranty period.
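The endurance difference is easy to make concrete. A sketch with assumed, order-of-magnitude figures (typical NOR flash endurance is on the order of 100,000 program/erase cycles; the logging rate is hypothetical):

```python
# How long a log can hammer one memory location before wear-out.
# Both figures are assumed, order-of-magnitude values for the sketch.
flash_endurance_cycles = 100_000      # typical NOR flash spec, assumed
writes_per_hour = 3600                # hypothetical: one logged event per second

hours_to_wearout = flash_endurance_cycles / writes_per_hour
print(f"Flash cell worn out after ~{hours_to_wearout:.0f} hours")   # ~28 hours
# MRAM vendors quote effectively unlimited write cycles, so the same
# log location would not be a wear-out concern.
```

In practice flash controllers mitigate this with wear-leveling, but a memory that simply does not wear out removes the failure mode outright.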

    Everspin’s MR25H256A bears little resemblance to the RAM you would find in your temperature-controlled, stationary PC. It meets the all-weather, shock, and vibration requirements imposed by vehicles. For example, automotive MRAMs have to operate reliably in temperatures ranging from -40°C to +125°C.

    Unlike Flash, DRAM, and FeRAM, MRAM claims to retain data with greater reliability, lower power consumption, and more resilience to temperature fluctuations (electrical engineers with more knowledge of memory than I have are welcome to correct me).

    Everspin is not alone with an MRAM chip. IBM was an early developer of MRAM technology and has conducted significant research into improving MRAM speed and density. Samsung has been actively developing MRAM for use in its consumer electronics and offers MRAM-based embedded memory for integrated circuits. The biggest semiconductor manufacturer in the world, Taiwan’s TSMC, manufactures embedded MRAM (eMRAM) for customers in automotive, IoT, and other applications.

    Everspin Technologies (from Everspin sources)

    Founded in 2008, Everspin Technologies is headquartered in Chandler, Arizona, and is a leader in magnetoresistive random access memory (MRAM) technology. The company designs and manufactures high-performance MRAM solutions, including persistent MRAM and Spin-Transfer Torque MRAM (STT-MRAM), which are used in data-intensive applications across various industries such as automotive, industrial IoT, and data centers. Everspin’s products are known for their endurance, fast read/write speeds, and data integrity, making them ideal for environments requiring reliable, non-volatile memory. Everspin’s MRAM solutions, such as the PERSYST MRAM family, offer virtually infinite write cycles, low latency, and resilience under extreme temperatures, positioning them as critical components in sectors needing robust data retention. With a customer base spanning diverse mission-critical industries, Everspin reported annual revenue of approximately $54 million in 2023, reflecting strong demand for durable memory solutions that ensure data integrity.

  • Altium to Introduce Discover, Develop and Lifecycle at Electronica 2024


    Altium will unveil three applications at the upcoming Electronica 2024 trade fair in Munich.

    Altium Discover: a digital marketplace from which electronics and electrical designers can select components. Discover is a cloud-based platform meant to be a collaborative ecosystem for the electronics supply chain, connecting semiconductor manufacturers, distributors, and developers. It will have real-time collaboration, persistent context, and transparent digital rights management. According to Altium, this should assist in reducing supply chain issues.

    Altium Develop: For small and medium-sized businesses (SMBs), Altium Develop combines electronics, mechanical, and software design into a single workflow. It sounds like Autodesk Fusion, though we expect it to emphasize the EDA aspect of electromechanical design.

    Altium Lifecycle: Altium Lifecycle offers comprehensive lifecycle control with compliance and audit trail features, ensuring reliable management across the product’s journey. It addresses regulatory needs and reinforces reliability throughout the product lifecycle, making it an ideal solution for enterprises focused on precision. It is meant for large enterprises that want to evaluate the environmental cost of their products or are being forced to do so.

    Electronica

    Electronica is one of the world’s largest trade fairs for the electronics industry, held biennially at the Messe München convention center in Munich, Germany. This year’s event will run from November 12-15, 2024, and is expected to draw approximately 70,000 attendees and over 2,800 exhibitors from around the world. Expect every major electronic product company to be there, as well as thousands of the companies that supply them, covering everything from semiconductors to advanced systems and applications, including Cadence Design Systems, Zuken, and Siemens EDA. See the exhibitor list.

    Altium

    Altium, headquartered in La Jolla, California, is a leading EDA provider in the electronics design industry, mainly known for its flagship product, Altium Designer, and the Altium 365 cloud platform. These tools are widely used for printed circuit board (PCB) design, serving sectors like automotive, medical, aerospace, and telecommunications. Altium has solidified its position in the market by providing a comprehensive ecosystem that integrates design, component sourcing, and lifecycle management in the cloud, making it one of the most prominent players in this domain.

    Altium’s annual revenue in fiscal year 2023 was approximately $263 million, reflecting its growth as more companies adopt digital and cloud-based PCB design tools. The company also reported a significant portion of this revenue as recurring, driven by solid demand for subscription-based access to its software, supporting Altium’s growth trajectory in the global PCB design and electronics market.