Friday, September 30, 2016

History: Hard Battle Against Dust

ElectronicsWeekly quotes the Sony history site on the yield of early CCDs:

"In January 1980, six years and three months after Iwama first gave orders to begin work on the CCD, the world's first CCD camera was produced. However, the production line's yield of viable CCD chips at this stage was poor and only one in several hundred was usable. It took twelve months to manufacture fifty-two CCD chips necessary to manufacture twenty-six cameras. As a result, each CCD was priced at 317,000 yen. Such a CCD would never realize the 50,000 yen target price that Iwama had specified.

"We can't call this a yield rate. It's more like an occurrence rate of usable chips," complained those overseeing the CCD production. The vicious "enemies" undermining their efforts were superfine dust particles that measured less than several microns and were undetectable by human eye. Clean rooms and dust proof attire reminiscent of space suits were introduced at all development and production sites. The battle against small dust was vigorous; people, machines, and every other thinkable source of dust were identified and dealt with.

After all the measures were applied, the yield rate improved enough to implement full mass production from 1983 at the Kokubu Semiconductor plant in Kagoshima."

"Once a quality problem affects an OEM customer, it requires a lot of time and effort to regain that customer's trust. Ochi and his team did have a very difficult time selling the CCD chips in 1983 when they first began OEM sales. They visited various camera manufacturers, but were turned away due to a poor track record. Sony had reduced the volume of semiconductors shipped to its customers to meet a surge in internal demand. Ochi and his team started by apologizing for this. The marketing of CCD chips was literally built on the team's sweat and tears."

ADAS Market Report

Woodside Capital Partners has published a report on the ADAS/autonomous sensing industry. A few slides from the report:

Videantis Takeaways from AutoSens 2016

Videantis posted its review of the AutoSens 2016 conference, listing five things it learned there:
  1. Self-driving cars are hard. After the Tesla Autopilot accidents, people have become more conservative about self-driving prospects. One analyst mentioned 2040 as a possible date for driverless cars to become available, beyond the planning horizon of most technology companies.
  2. Deep learning is hard. The nets need to be trained with huge amounts of image data, which needs to be properly annotated by hand, with some companies having hundreds of people on staff to perform this task.
  3. Image quality is hard. Image quality will remain a key topic for quite some time.
  4. We need more sensors. OEMs are designing 12 cameras into their cars, and this number continues to go up.
  5. Surround view is replacing rear view. Surround-view systems are quickly turning the standalone rear-view camera into yesteryear's technology.

Thursday, September 29, 2016

First Visual Innovation Award

Arizona State University: The new Visual Innovation Award has been announced at the IEEE International Conference on Image Processing.

Ren Ng, founder of the Lytro light-field camera company, has been named the first recipient of the award. Other finalists related to the image sensing field were Achin Bhowmik, Intel VP and development and deployment lead for Intel RealSense camera technology; Brendan Iribe, co-founder and CEO of Oculus VR; and Alex Kipman, inventor of Microsoft Kinect.

Nvidia on Key Issues in Automotive Imaging

Image Sensors Auto US publishes an interview with Nvidia imaging architect Joshua Wise. A few points from the interview:

Q: What are the key standards issues that need to be addressed as components get more complex and diverse?

Joshua Wise: There are two that are top on my list right now. The first that comes to mind is safety and compliance: an important issue in the automotive environment is the ability to self-diagnose issues. In short, the system must “know when it doesn’t know”. As more components enter the ecosystem, there is more opportunity for data to be damaged in transit — and, similarly, more components result in more health data that must be aggregated and transmitted.

The second on my list is the need for a standard in transmitting data with a high dynamic range. There are as many implementations of sending pixels with greater than 12 bits of data as there are vendors right now — perhaps even more! We’ve been working with vendors to come up with solutions that work with both modern and legacy components, but we see an opportunity to unify and standardize here.
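One family of vendor-specific approaches to the >12-bit problem Wise describes is piecewise-linear (PWL) companding: the sensor compresses a wide linear pixel value onto a 12-bit code with progressively coarser steps at higher intensities. The sketch below illustrates the idea with a 20-bit input and made-up knee points; the segment table is an assumption for illustration, not any vendor's actual curve.

```python
# Illustrative PWL companding of a 20-bit HDR pixel into a 12-bit code.
# Each segment: (x_start, y_start, right_shift). Inputs from x_start up
# to the next segment's start are divided by 2**right_shift.
# NOTE: these knee points are hypothetical, chosen only so the full
# 20-bit range fits under 4096 output codes.
SEGMENTS = [
    (0,       0,    0),  # 0..1023: pass through at full precision
    (1 << 10, 1024, 4),  # 1024..16383: divide by 16
    (1 << 14, 1984, 9),  # 16384..1048575: divide by 512
]
MAX_INPUT = 1 << 20  # 20-bit linear pixel

def compress(x):
    """Compand a 20-bit linear pixel value into a 12-bit code."""
    assert 0 <= x < MAX_INPUT
    for x_start, y_start, shift in reversed(SEGMENTS):
        if x >= x_start:
            return y_start + ((x - x_start) >> shift)

def decompress(y):
    """Approximate inverse: expand a 12-bit code back to 20-bit linear."""
    for x_start, y_start, shift in reversed(SEGMENTS):
        if y >= y_start:
            return x_start + ((y - y_start) << shift)

# The brightest 20-bit value still fits in 12 bits after companding:
print(compress(MAX_INPUT - 1))  # 3999, i.e. below 4096
```

The reconstruction error is bounded by the step size of the segment a pixel falls in (here, at most 512 codes in the brightest range), which is exactly the kind of per-vendor trade-off a transmission standard would need to pin down.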

Wednesday, September 28, 2016

Keynote on Fast Image Sensors

Prof. Wilfried Uhring of the University of Strasbourg and CNRS presented his keynote "High Speed Image Sensors" at the SIGNAL 2016 conference on June 27, 2016, in Lisbon, Portugal. A few slides out of 33:


Regarding the I/O speed, the 25 GPixel/s limitation is somewhat obsolete by now. For example, the PCIe 4.0 standard defines up to 32 lanes of 16 Gbps each, for an aggregate bandwidth of 512 Gbps. Assuming 10 bits per pixel, one can reach 51 GPixel/s of I/O speed just by buying PCIe 4.0-compliant IP. And PCIe 4.0 is not the fastest interface available these days.
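The arithmetic above can be checked directly; note this uses raw line rates and ignores PCIe's 128b/130b encoding and protocol overhead, which would shave a few percent off in practice.

```python
# Back-of-the-envelope check of the PCIe 4.0 pixel-rate estimate.
lanes = 32                # widest link width defined by the PCIe spec
gbps_per_lane = 16        # PCIe 4.0 raw line rate per lane
bits_per_pixel = 10       # assumed pixel depth

aggregate_gbps = lanes * gbps_per_lane            # 512 Gbps aggregate
gpixels_per_s = aggregate_gbps / bits_per_pixel   # 51.2 GPixel/s
print(aggregate_gbps, gpixels_per_s)              # 512 51.2
```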

Challenges in Time Correlated Single Photon Counting Imagers

C. Bruschini and E. Charbon (EPFL and Delft TU) presented "(Challenges in) Time Correlated Single Photon Counting Imagers" at the SIGNAL 2016 conference, held on June 26-30, 2016, in Lisbon, Portugal. A few slides out of 55:

CEVA Presents its XM6 Embedded Vision Platform

CEVA introduces a new DSP-based platform bringing deep learning and AI capabilities to low-power embedded systems. The new IP platform is centered around the new XM6 imaging and vision DSP:


Tuesday, September 27, 2016

Rambus LSS Platform

Rambus launches the Partners in Open Development 2.0 (POD 2.0) evaluation platform for its lensless image sensors:



Update: Rambus also publishes an eye tracking use case video:

DJI Mavic Drone Features (Somewhat) Autonomous Flying

SiliconRepublic: the DJI Mavic drone uses 5 vision sensors, a Movidius Myriad 2 vision processor, and 24 processing cores to offer limited flight autonomy: