

Automotive Electronic Systems—a View of the Future



By Ron Wilson, Editor-in-Chief, Altera Corporation

Advanced driver assistance systems (ADAS) are the hot topic today in automotive electronics. These systems range from passive safety features that monitor lane departures, through active safety features like adaptive cruise control, to, in the future, situation-aware collision-avoidance systems. The increasing demands ADAS evolution places on data transport and computing are fundamentally changing automotive electronics architectures. And it is becoming clear that these changes foreshadow the future for many other kinds of embedded systems.


Goals and Requirements

Today vehicle-safety electronic systems are isolated functions that control a specific variable in response to a specific set of inputs. An air-bag controller, for example, detonates its explosive charge when an accelerometer output trips a limit comparator. A traction-control system applies a brake to reduce torque on a wheel when the data stream from a shaft encoder indicates sudden acceleration. While these systems make contributions to vehicle safety, they can also act inappropriately because their inputs give them a very narrow view of the world. Hitting a pot-hole or bumping into a car while parking can fire an air bag. A rough road can puzzle traction control.

All that is about to change, according to Steve Ohr, semiconductor research director at Gartner. “Advanced air-bag controllers have multiple sensors that literally vote on whether a crash is happening,” Ohr explained in introduction to his panel at the GlobalPress Summit in Santa Cruz, California, on April 24. “In the near future, the controllers will consult sensors that monitor passengers and cargo to identify how best to deploy the various air bags during a crash.”

At this point, the air-bag controller has crossed a critical threshold: from responding to an input to maintaining—and responding to—a dynamic model of the vehicle. This change, Ohr emphasized, is being echoed in other systems throughout the vehicle, with profound consequences. “We see the same pattern in safety systems such as lane-exit monitors and impending-hazard detectors,” Ohr stated. “Each system is getting more intelligence, moving to sensor integration and then to sensor fusion.”

This evolution is happening in an already astoundingly complex environment. Panelist Frank Schirrmeister, senior director of product marketing at Cadence Design Systems, observed, “In 2010, a high-end car could have 750 CPUs, performing 2,000 different functions, and requiring one billion lines of code.” Schirrmeister said that this degree of complexity was forcing developers to adopt hardware-independent platforms such as the Automotive Open System Architecture (AUTOSAR), and integrated mechanical-electrical-software development suites. In this fog of complexity, system designers are struggling to cope with a sudden surge of change in the way the systems handle data.


Isolation to Fusion

Hazard-avoidance systems offer a microcosm of these sweeping changes, according to panelist Brian Jentz, automotive business-unit director at Altera Corporation. Today, relatively simple systems like back-up cameras can already have significant processing requirements, Jentz said. “Inexpensive cameras need fish-eye correction to fix the perspective so drivers can interpret the display easily.” These cameras also need compensation to produce useful images in low light, and often they will require automated object recognition. These functions can be done better in the camera, but it is often cheaper to do them in the central electronic control unit (ECU). “Cameras are moving to high definition,” Jentz continued, “and this can mean megapixels per frame. If you are sending images to the ECU, you may have to compress the data before it leaves the camera.”
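To make the fish-eye correction concrete, here is a minimal sketch of the radial ("barrel") distortion mapping such a correction rests on, assuming a simple one-coefficient model. The function name, the image-center coordinates, and the coefficient value are all illustrative assumptions; production cameras use calibrated multi-parameter lens models.

```python
import math

def undistort_point(x_d, y_d, cx, cy, k1):
    """Map a distorted (fish-eye) pixel toward its rectilinear position
    using a one-coefficient radial model: r_u = r_d * (1 + k1 * r_d^2).
    cx, cy: image center; k1: distortion coefficient (calibrated per lens)."""
    dx, dy = x_d - cx, y_d - cy
    r_d = math.hypot(dx, dy)          # distance from the optical center
    scale = 1.0 + k1 * r_d * r_d      # positive k1 pushes points outward
    return cx + dx * scale, cy + dy * scale

# A point at the image center is unaffected by radial distortion.
print(undistort_point(320.0, 240.0, 320.0, 240.0, k1=1e-6))  # -> (320.0, 240.0)
```

In practice this mapping is evaluated (or table-driven) for every output pixel of every frame, which is why even a "simple" back-up camera carries a real processing load.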

Further evolution will only complicate the data-transport problem. Hazard detection will move from simply showing an image from a rear-facing camera to modeling the whole dynamic environment surrounding the car. At this point the system must stitch together images from multiple cameras—at least eight for a 360-degree view with range and velocity detection, as shown in Figure 1. A central processor is absolutely necessary, and the ADAS must transport many streams of compressed video to the ECU concurrently.

Figure 1. Placement and use of cameras determines the algorithms required to process the images.

But things get harder still. Video cameras are hampered by darkness and disabled by rain, snow, road spray, and other sorts of optical interference. So designers team the video cameras with directed-beam, millimeter-wave radar to improve reliability in low-visibility conditions. Now the ECU must fuse the video data with the very different radar signal in order to interpret its surroundings. This fusion will probably be done using a system-estimation technique called a Kalman filter.

Kalman and its Discontents

A Kalman filter can take in multiple streams of noisy data from different sorts of sensors and combine them into a single, less-noisy model of the system under observation. It does this, roughly speaking, by maintaining three internal data sets: a current estimate of the state of the system, a “dead reckoning” model—usually based on physics—for predicting the next state of the system, and a table rating the credibility of each input. On each cycle, the Kalman filter assembles the sensor data and uses it to create a provisional estimate of the system state: for example, the locations and velocities of the objects surrounding your car. Simultaneously, the filter creates a second estimate by applying the dead-reckoning model to the previous state: the other cars should have moved to here, here, and here, the pedestrian should have walked that far, and the trees should have stayed where they were. Next, the filter compares the two state estimates, and taking into account the credibility ratings of the inputs, updates the previous state with a new best estimate: here’s where I think everything is really. Finally, the Kalman filter sends the new state estimate to the analysis software so it can be evaluated for potential hazards, and it updates its sensor-credibility table to make note of any questionable inputs.
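The cycle just described—dead-reckon forward, compare with the measurement, and blend by credibility—can be illustrated with a one-dimensional Kalman filter tracking a single object's position. A real ADAS filter is multivariate and fuses several sensor types; the `ScalarKalman` class and every constant below are illustrative assumptions, not production code.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter: constant-velocity dead reckoning plus
    a noisy position measurement. All constants are illustrative."""
    def __init__(self, x0, v, q, r):
        self.x = x0      # current state estimate (position)
        self.v = v       # assumed velocity (the "physics" model)
        self.p = 1.0     # variance of the estimate
        self.q = q       # process noise (how much to distrust the model)
        self.r = r       # measurement noise (sensor credibility)

    def step(self, z, dt=1.0):
        # Predict: dead-reckon the previous state forward.
        x_pred = self.x + self.v * dt
        p_pred = self.p + self.q
        # Update: blend prediction and measurement by relative credibility.
        k = p_pred / (p_pred + self.r)      # Kalman gain
        self.x = x_pred + k * (z - x_pred)
        self.p = (1.0 - k) * p_pred
        return self.x

# True position advances ~1 unit per step; readings are noisy.
kf = ScalarKalman(x0=0.0, v=1.0, q=0.01, r=4.0)
for z in [1.3, 1.8, 3.4, 3.9, 5.2]:
    est = kf.step(z)
# The final estimate tracks the true position (~5) more smoothly
# than the raw readings do.
```

The gain `k` plays the role of the credibility table: a sensor with large `r` pulls the estimate only weakly away from the dead-reckoned prediction.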

The good news is that the Kalman filter can assemble a stable and accurate model of the outside world despite intermittent readings, high noise levels, and a mix of very different kinds of sensor data. But there are issues, too. Kalman filters working with high-definition (HD) video inputs can consume huge amounts of computing power, and the analytic routines they enable can take far more, as suggested in Figure 2. “Algorithm development is already ahead of silicon performance,” Jentz noted. “There is basically an unlimited demand for performance.”

Figure 2. Sensor fusion concentrates many heavy algorithms and network terminations on one chip.

There is another issue with important system implications. While Kalman filters are inherently tolerant of noise, they cannot be immune to it. And variations in the latency between the sensors and the ECU—particularly if the variation is large enough for samples to arrive out of order—appear as noise. Such latency variations can cause the filter to reduce its reliance on some sensors, or to ignore altogether information that could have made a vital difference.
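One common way to contain that jitter is a small time-stamped reorder buffer at the ECU: samples are released to the filter in timestamp order after a hold-off window, and anything arriving later than the window is dropped rather than fed downstream out of order. A minimal sketch, in which the window length, record format, and class name are all assumptions:

```python
import heapq

class ReorderBuffer:
    """Hold time-stamped sensor samples briefly and release them in order.
    Samples older than the newest released timestamp arrive too late and
    are discarded (out of order, they would look like noise to a filter)."""
    def __init__(self, window):
        self.window = window          # hold-off window, in timestamp units
        self.heap = []                # min-heap ordered by timestamp
        self.released = float("-inf") # newest timestamp already released
        self.dropped = 0

    def push(self, ts, sample):
        if ts <= self.released:
            self.dropped += 1         # arrived after its slot was released
            return
        heapq.heappush(self.heap, (ts, sample))

    def pop_ready(self, now):
        """Release every sample whose timestamp is older than now - window."""
        out = []
        while self.heap and self.heap[0][0] <= now - self.window:
            ts, sample = heapq.heappop(self.heap)
            self.released = ts
            out.append((ts, sample))
        return out

buf = ReorderBuffer(window=2)
for ts, s in [(1, "a"), (3, "c"), (2, "b")]:   # arrives out of order
    buf.push(ts, s)
print(buf.pop_ready(now=5))   # -> [(1, 'a'), (2, 'b'), (3, 'c')]
```

The design trade-off is visible in the `window` parameter: a longer window tolerates more jitter but adds fixed latency to every sample, which is itself a hazard in a collision-avoidance loop.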

This is important because of trends in vehicle network architectures. Purpose-built control networks such as the controller-area network (CAN) or the perhaps-emerging FlexRay network can limit jitter and guarantee delivery of packets carrying some sensor data, although they may lack the bandwidth for even compressed HD video. In principle, system designers could calculate the bandwidth they need for a given maximum jitter, and then provision the system with enough network links to meet the need, even if that resulted in dedicated CAN segments for each camera and radar receiver. But in practice, automotive manufacturers are headed in a different direction: cost control.

“The direction is Ethernet everywhere in the car,” argued panelist Ali Abaye, senior director of product marketing at Broadcom. Abaye said that as the number of sensors increases, cost-averse manufacturers—including the high-end brands—are trying to collapse all their various control, data, and media networks onto a single twisted-pair Ethernet running at 100 Mbit/s or 1 Gbit/s.

But a shared network raises the latency issue again. Because Ethernet creates delivery uncertainties, some sort of synchronizing protocol—IEEE 1588, Time-Triggered Protocol (TTP), or Audio Video Bridging (AVB)—would appear necessary. “This is still an active discussion,” Schirrmeister said. “The existing protocols are not yet sufficient for everything these systems need to do.” Abaye, who has 100 Mbit/s transceivers to sell, is more confident. “Our opinion is that the AVB protocol is sufficient,” he stated.
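These protocols share the same basic mechanism: IEEE 1588 (PTP), on which AVB's timing layer (IEEE 802.1AS) is based, estimates a slave clock's offset from the master using the four timestamps of a two-way message exchange, assuming the path delay is symmetric. The timestamps below are illustrative values, not captured traffic:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """IEEE 1588 two-way exchange:
    t1: master sends Sync        t2: slave receives Sync
    t3: slave sends Delay_Req    t4: master receives Delay_Req
    Assumes the network delay is the same in both directions."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
    return offset, delay

# Slave clock running 5 units ahead, 3 units of path delay each way:
#   t2 = t1 + delay + offset,  t4 = t3 + delay - offset
print(ptp_offset_and_delay(t1=100, t2=108, t3=200, t4=198))  # -> (5.0, 3.0)
```

The symmetric-delay assumption is exactly where a loaded, shared Ethernet segment can undermine the protocol: queuing in one direction shows up as a clock-offset error.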

These debates will have system implications well beyond the cost of cabling. Gigabit Ethernet implies silicon at advanced process nodes, where issues like cost, availability, and soft-error rates become questions. Synchronizing protocols are not exactly light-weight, implying the need for more powerful network adapters. And the need to store and possibly reorder frames of time-stamped data from many sensors could impact memory footprints.


A Multibody Problem

As a final point, when you put radar or scanning lasers into the ADAS architecture, you get a fascinating side-effect. The ADAS on nearby vehicles can now interact with each other. This could lead to sensor interference, or even to an unstable multivehicle system in which two cars hazard-avoid right into each other. This is not a whimsical question: there are hazard-avoidance algorithms that, when used by multiple vehicles in the same traffic stream, are known to lead inevitably to crashes.

“There has already been some research into the behavior of multi-ADAS systems,” Schirrmeister said. “It is an area of continuing interest.”

Such questions will almost certainly involve regulatory agencies in North America and the European Union in the design of ADAS algorithms at some level. Schirrmeister speculated that in developing countries, where cities can spring up and create all-new infrastructure as they go, there may be a move to coordinate ADAS evolution with the development of smart highways.

In any case, it is clear that verification of these systems will involve a significant degree of full-system, and perhaps multisystem, modeling. These will be huge tasks, going well beyond the experience of most system-design teams outside the military-aerospace community.

We have traced the evolution of one automotive system, ADAS, from a set of isolated control loops to a centralized sensor-fusing system. Other systems in the car will follow the same evolutionary path. Then the systems will begin to merge: ADAS, for example, working with the engine-control and traction systems can bypass the driver altogether and maneuver the car away from trouble. The endpoint is an autonomous vehicle—and a network of intelligent control systems of stunning complexity built around a centralized model of the outside world.




Building a Food Raw-Material Warehouse Identification Solution with RFID and Wireless Systems



Turck RFID and wireless communication systems deliver real-time production monitoring along with logistics and shelf-life management!

Recently, RFID technology has been widely adopted worldwide not only in logistics and warehouse management but also on production lines. Chinese companies in particular are investing heavily in RFID for raw-material handling, production, storage, and transport to build more efficient processes, and this is strengthening their position in the food industry.

Turck RFID and wireless communication systems deliver real-time production monitoring along with logistics and shelf-life management (Photo: Turck)

A well-known Chinese company in the industry needed a reliable, cost-saving technology for warehouse management. It implemented a real-time material-management system that uses Turck RFID technology to track and manage raw-material transport. The system checks the warehouse inventory list, assigns each pallet to stock, and then writes data via RFID read/write heads (antennas) to the data carriers attached to the pallets. A PLC automatically assigns each pallet a location in the warehouse and guides forklifts to store the material there. The forklifts are fitted with read/write heads (antennas) that report the location and identity of materials and products to the central control system.

The customer chose Turck's IP67-rated BL ident RFID modules and BL67 fieldbus gateways and installed them on the forklifts for operation in both dry and damp environments (Photo: Turck)

For high availability, all data is mirrored between the central control system and distributed stores, and outbound shipments follow a first-in, first-out (FIFO) policy. The system guides the forklifts so that materials reach the correct production line for each production order. In transit, the forklift's read/write head checks the pallet's data carrier to confirm whether the material is required. If an error occurs, the system raises an alarm and automatically displays the corrective action needed to replenish the material.
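The first-in, first-out (FIFO) outbound policy described above can be sketched as a priority queue keyed on storage date. The `FifoStock` class, pallet IDs, and date format below are illustrative assumptions, not Turck's actual data model:

```python
import heapq

class FifoStock:
    """First-in, first-out pallet picking: the oldest stored pallet of an
    item is dispatched first, which also protects product shelf life."""
    def __init__(self):
        self.by_item = {}   # item -> min-heap of (stored_date, pallet_id)

    def store(self, item, stored_date, pallet_id):
        heapq.heappush(self.by_item.setdefault(item, []),
                       (stored_date, pallet_id))

    def pick(self, item):
        heap = self.by_item.get(item)
        if not heap:
            raise LookupError(f"no stock for {item!r}")
        return heapq.heappop(heap)[1]   # oldest pallet first

stock = FifoStock()
stock.store("MSG-raw", 20180301, "PAL-007")
stock.store("MSG-raw", 20180225, "PAL-003")
print(stock.pick("MSG-raw"))   # -> PAL-003 (stored earlier)
```

In the actual system the "heap" is the central control database and the pallet IDs come from the RFID data carriers, but the dispatch ordering follows the same rule.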

RFID Technology Reads an Entire Pallet Load in One Pass

Because the pallets already carry RFID tags, precise digital management of large product volumes is straightforward. Instead of pre-printing numerous barcode labels, applying them to the pallet, and scanning each with a barcode scanner, operators can now complete all of this in a single read/write pass on the tagged pallet. RFID tags are unaffected by surface scratches or stains, so they work in demanding logistics environments and can be reused. Unlike barcode labels, the tags also work in the rain, and each tag's unique ID (UID) improves material-tracking accuracy.

Monosodium glutamate is produced in a high-humidity environment, while its powdered raw materials must be stored separately, so the system had to work flawlessly in damp conditions as well as in dry, dusty ones. For this reason the customer chose Turck's IP67-rated BL ident RFID modules and BL67 fieldbus gateways, installing them on the forklifts for operation in both dry and damp environments. Because the BL ident read/write heads can read and write data while in motion, they could be mounted directly on the forklifts.

Because the pallets already carry RFID tags, precise digital management of large product volumes is straightforward (Photo: Turck)

Collecting Data from Moving Forklift Gateways over a Wireless System

The biggest challenge in implementing the system was connecting the forklift gateways to the central control system. Because the forklifts are constantly moving, a cabled connection was impossible. After several rounds of discussion and testing, Turck proposed a wireless Ethernet network solution in which programmable gateways communicate with the control level. With this solution, the customer achieved real-time logistics management and product shelf-life management across the entire production system.

The project manager summarized the advantages of Turck's BL ident system: “Turck's read/write heads have the same form factor as proximity sensors, which made them easy to use and easy to install. The 0 to 200 mm operating distance met our application requirements, and Turck supplied the rugged RFID system together with 50 m field-wireable connection cables, ensuring reliable data transmission even in harsh environmental conditions. The wireless network was also simple to implement, requiring only gateways and network nodes.” [Source: Turck Korea]

By 박은주 (Park Eun-ju)



IEEE Announces IT Healthcare Standard in Advance of the IEEE Annual International Conference of EMBS



The 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS) will highlight the latest advancements in biomedical engineering, healthcare technology R&D, translational clinical research, technology transfer and entrepreneurship, and biomedical engineering education.

IEEE, the world’s largest technical professional organization dedicated to advancing technology for humanity, and the IEEE Standards Association (IEEE-SA) today announced the availability of IEEE 11073-20702™—Health informatics: Point-of-Care Medical Device Communication—Standard for Medical Devices Communication Profile for Web Services in advance of the upcoming EMBC ’17, to be held 11-15 July at the International Convention Center on Jeju Island, South Korea. IEEE also announced the approval of two new healthcare-related standards projects: IEEE P1708™—Standard for Wearable Cuffless Blood Pressure Measuring Devices, and IEEE P1752™—Standard for Mobile Health Data.

Sponsored by the IEEE 11073™ Standards Committee (EMB/11073), the IEEE 11073 family of standards is used to structure data and enable data transmission across multiple healthcare devices, ensuring effective interoperability and communication among medical, healthcare, and wellness devices, as well as with external computer systems. IEEE 11073-20702 defines a communication protocol specification for a distributed system of point-of-care (PoC) medical devices and medical IT systems that need to exchange data, or safely control networked PoC medical devices, by profiling Web Service specifications.

“IEEE 11073-20702 is the first standard that streamlines the integration of medical devices with the IP stack common to IT specialists around the world,” said Bill Ash, strategic technology program director, IEEE-SA. “Ensuring the safe and secure transmission of healthcare data is essential to advance value-based healthcare, giving patients easy access to their medical information and simplifying IT processes for point-of-care facilities.”

The IEEE EMB Standards Committee (EMB/Stds Com) is the technical sponsor of the other standards-development projects.

IEEE P1708 is working to establish standardized guidelines for manufacturers to qualify and validate their products, for potential purchasers or users to evaluate and select prospective products, and for healthcare professionals to understand the manufacturing practices behind wearable, cuffless blood pressure (BP) devices. The intent is to establish objective performance evaluation of wearable, cuffless BP measuring devices. The standard may be applied to all types of wearable BP measurement devices, regardless of their modes of operation (e.g., short-term, long-term, snapshot, continuous, beat(s)-to-beat(s) BP, or BP variability measurement), and is independent of the form of the device or the vehicle to which the device is attached or in which it is embedded.

IEEE P1752 is intended to provide meaningful description, exchange, sharing, and use of mobile health data to support analysis for a set of consumer health, biomedical research, and clinical care needs. The standard will leverage data and nomenclature standards such as the IEEE 11073 family of standards for personal health devices as references, defining specifications for a mobile health data applications programming interface (API) and standardized representations for mobile health data and metadata. Furthermore, IEEE-SA is active on many eHealth related projects, including four standards in development addressing medical 3D printing undertaken by the IEEE 3333.2™ working group.

“The IEEE EMBS members are highly committed, informed, and innovative biomedical engineers from around the world dedicated to finding new discoveries that can advance biomedical technologies through collaboration and translational engineering into product realization in the market place,” said Carole Carey, chair, IEEE Engineering in Medicine and Biology Society Standards Committee. “With the significant growth in the development and innovation of new technologies being applied to health and healthcare systems, the need for developing consensus standards in an open and global environment is essential. EMBS is pleased to host its 39th Annual International Conference, where IEEE EMBS has assembled an extensive agenda and a prestigious list of keynote speakers for sharing best practices and discovery to meet the technological challenges today and for the future.”

hordon kim



© Copyright 2006-2018 아이씨엔미래기술센터 All Rights Reserved.