Monday, May 06, 2024

Job Postings - Week of 5 May 2024

UC Santa Cruz

Systems Design and Characterization Engineer

Santa Cruz, California, USA

Link

FAPESP - São Paulo Research Foundation

Young Investigator Position in Quantum Technologies

São Paulo, Brazil

Link

Apple

Pixel Development Engineer

Cupertino, California, USA

Link

Meta – Facebook App

Sensor Application Engineer

Sunnyvale, California, USA

Redmond, Washington, USA

Link

University of Houston

Postdoctoral/Senior Research Scientist - X-ray Photon Counting Detectors

Houston, Texas, USA

Link

IRFU

Staff position in detector physics at CEA/IRFU/DEDIP

Saclay, France

Link

NASA

Development of infrared detectors and focal plane arrays for space instruments

Pasadena, California, USA

Link

Forvia-Faurecia

ADAS Camera Systems Engineer

Northville, Michigan, USA

Link

University of Edinburgh

Sensor and Imaging Systems MSc 

Edinburgh, Scotland, UK

Link

Friday, May 03, 2024

Foveon sensor development "still in design stage"

https://www.dpreview.com/interviews/6004010220/sigma-full-frame-foveon

Full-frame Foveon sensor "still at design stage" says Sigma CEO, "but I'm still passionate"

"Unfortunately, we have not made any significant progress since last year," says Sigma owner and CEO Kazuto Yamaki, when asked about the planned full-frame Foveon camera. But he still believes in the project and discussed what such a camera could still offer.

"We made a prototype sensor but found some design errors," he says: "It worked but there are some issues, so we re-wrote the schematics and submitted them to the manufacturer and are waiting for the next generation of prototypes." This isn't quite a return to 'square one,' but it means there's still a long road ahead.

"We are still in the design phase for the image sensor," he acknowledges: "When it comes to the sensor, the manufacturing process is very important: we need to develop a new manufacturing process for the new sensor. But as far as that’s concerned, we’re still doing the research. So it may require additional time to complete the development of the new sensor."

The Foveon design, which Sigma now owns, collects charge at three different depths in the silicon of each pixel, with longer wavelengths of light able to penetrate further into the chip. This means full-color data can be derived at each pixel location rather than having to reconstruct the color information based on neighboring pixels, as happens with conventional 'Bayer' sensors. Yamaki says the company's thinking about the benefits of Foveon has changed.
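The wavelength-versus-depth behavior the stacked design exploits can be sketched with a simple Beer-Lambert absorption model. The absorption lengths and layer boundaries below are rough illustrative values for silicon, not Sigma/Foveon specifications; note how every layer absorbs some of every color, which is the crosstalk discussed later in the interview.

```python
import math

# Illustrative absorption lengths (1/alpha) in silicon, in micrometers.
# Rough textbook-order values, not Sigma/Foveon specifications.
ABS_LENGTH_UM = {"blue_450nm": 0.4, "green_550nm": 1.5, "red_650nm": 3.5}

# Hypothetical depth ranges of the three stacked photodiodes (um).
LAYER_DEPTHS_UM = [(0.0, 0.2), (0.2, 0.6), (0.6, 3.0)]

def absorbed_fraction(abs_length_um, z0, z1):
    """Beer-Lambert: fraction of incident photons absorbed between depths z0 and z1."""
    return math.exp(-z0 / abs_length_um) - math.exp(-z1 / abs_length_um)

def layer_response(color):
    """Fraction of photons of a given color absorbed in each of the three layers."""
    L = ABS_LENGTH_UM[color]
    return [absorbed_fraction(L, z0, z1) for z0, z1 in LAYER_DEPTHS_UM]

for color in ABS_LENGTH_UM:
    top, mid, bottom = layer_response(color)
    print(f"{color}: top={top:.2f} mid={mid:.2f} bottom={bottom:.2f}")
```

Running this shows blue dominating the shallow layer and red the deep layer, but with substantial overlap everywhere, so the three raw layer signals must be unmixed to obtain R, G and B.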

"When we launched the SD9 and SD10 cameras featuring the first-generation Foveon sensor, we believed the biggest advantage was its resolution, because you can capture contrast data at every location. Thus we believed resolution was the key." he says: "Today there are so many very high pixel-count image sensors: 60MP so, resolution-wise there’s not so much difference."

But, despite the advances made elsewhere, Yamaki says there's still a benefit to the Foveon design: "I’ve used a lot of Foveon sensor cameras, I’ve taken a bunch of pictures, and when I look back at those pictures, I find a noticeable difference," he says. And, he says, this appeal may stem from what might otherwise be seen as a disadvantage of the design.

"It could be color because the Foveon sensor has lots of cross-talk between R, B and G," he suggests: "In contrast, Bayer sensors only capture R, B and G, so if you look at the spectral response a Bayer sensor has a very sharp response for each color, but when it comes to Foveon there’s lots of crosstalk and we amplify the images. There’s lots of cross-talk, meaning there’s lots of gradation between the colors R, B and G. When combined with very high resolution and lots of gradation in color, it creates a remarkably realistic, special look of quality that is challenging to describe."

The complexity of separating the color information that the sensor has captured is part of what makes noise such a challenge for the Foveon design, and this is likely to limit the market, Yamaki concedes:
"We are trying to make our cameras with the Foveon X3 sensor more user-friendly, but still, compared to the Bayer sensor cameras, it won’t be easy to use. We’re trying to improve the performance, but low-light performance can’t be as good as Bayer sensor. We will do our best to make a more easy-to-use camera, but still, a camera with Foveon sensor technology may not be the camera for everybody."

But this doesn't dissuade him. "Even if we successfully develop a new X3 sensor, we may not be able to sell tons of cameras. But I believe it will still mean a lot," he says: "despite significant technology advancements there hasn't been much progress in image quality in recent years. There’s a lot of progress in terms of burst rate or video functionality, but when you talk just about image quality, about resolution, tonality or dynamic range, there hasn’t been so much progress."

"If we release the Foveon X3 sensor today and people see the quality, it means a lot for the industry, that’s the reason I’m still passionate about the project."

Wednesday, May 01, 2024

Nexchip mass produces 55nm and 90nm BSI CIS

Google translation of a news article:

Jinghe Integration puts 50-megapixel image sensor into mass production, plans to double its CIS capacity this year

According to Jinghe Integration (688249), following the mass production of its 90nm CIS and 55nm stacked CIS, the company has added a new CIS product: a 55nm single-chip 50-megapixel back-side illuminated (BSI) image sensor has recently entered mass production, serving a wide range of smartphone applications and marking a move from the low- and mid-range into mid-to-high-end segments. Jinghe Integration plans to double its CIS production capacity this year, with CIS shipments expected to rise significantly and become its second-largest product line after display driver chips.

Nexchip's website shows its technology roadmap:

https://www.nexchip.com.cn/en-us/Service/Roadmap


Sunday, April 28, 2024

Airy3D - Teledyne e2v collaboration

Link: https://www.airy3d.com/airy3d-e2v-collaboration/

Teledyne e2v and Airy3D collaboration delivers more affordable 3D vision solutions 

Grenoble, FRANCE, April 23, 2024 — Teledyne e2v, a Teledyne Technologies [NYSE: TDY] company and global innovator of imaging solutions, is pleased to announce a new technology and design collaboration with Airy3D (Montreal, Canada), a leading 3D vision solution provider. The first result of this partnership is the co-engineering of the recently announced Topaz5D™, a low-cost, low power, passive, 2 megapixel global shutter sensor which produces 2D images and 3D depth maps.

Arnaud Foucher, Business Team Director at Teledyne e2v, said, “We’re very excited to have collaborated with Airy3D on the development of Topaz5D, our latest unique CMOS sensor. The need to deploy alternative 3D vision solutions in different industries is crucial. Teledyne e2v’s image sensor design capability coupled with Airy3D’s proven 3D technology has allowed us to develop more 3D vision products for several market segments with a reduced cost of ownership.”

Chris Barrett, CEO of Airy3D, commented, “Airy3D uniquely combines our patented optically Transmissive Diffraction Mask (TDM) design and deep software processing know-how, enabling our partners to add value to their products. Teledyne e2v’s image sensor design, production and supply chain expertise are paramount in introducing these novel 3D solutions to the market and this initiative is a key milestone for us.”

A Topaz5D Evaluation Kit and monochrome and color sensor samples are available now for evaluations and design. Please contact Teledyne e2v for more information.

Friday, April 26, 2024

Lecture on Noise in Event Cameras and "SciDVS" camera

Talk title: "Noise Limits of Event Cameras", presented at the Cambridge Huawei Frontiers in Image Sensing 2024

Speaker: Prof. Tobi Delbruck

Abstract: "Cameras that mimic biological eyes have a 50 year history and the DVS event camera pixel is now nearly 20 years old. Event camera academic and industrial development is active, but it is only in the last few years that we understand more about the ultimate limits on their noise performance. This talk will be about those limits: What are the smallest changes that we can detect at a particular light intensity and particular speed? What are the main sources of noise in event cameras and what are the limits on these? I will discuss these results in the context of our PhD student Rui Graca’s work on SciDVS, a large-pixel DVS that targets scientific applications such as neural imaging and space domain awareness."
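As background for the noise discussion, an idealized (noise-free) DVS pixel can be modeled as a log-intensity change detector that fires an ON or OFF event whenever the log intensity moves by more than a contrast threshold from its level at the previous event. The threshold value and function names below are illustrative, not taken from the talk; real pixels add photon shot noise and bandwidth limits, which is what the noise analysis addresses.

```python
import math

THETA = 0.2  # contrast threshold in log-intensity units; typical DVS values are ~0.1-0.3

def dvs_events(intensities, theta=THETA):
    """Generate ON (+1) / OFF (-1) events from a per-pixel intensity time series.

    Emits an event each time log intensity moves by more than theta from the
    reference level set at the previous event: the idealized DVS pixel model.
    """
    events = []
    ref = math.log(intensities[0])
    for i, intensity in enumerate(intensities[1:], start=1):
        log_i = math.log(intensity)
        while log_i - ref > theta:   # brightness increased: ON event(s)
            ref += theta
            events.append((+1, i))
        while ref - log_i > theta:   # brightness decreased: OFF event(s)
            ref -= theta
            events.append((-1, i))
    return events

# A pixel that brightens, holds steady, then dims yields ON events then OFF events:
print(dvs_events([100, 150, 220, 220, 90]))
```

In this model the smallest detectable change at a given intensity is set purely by theta; the talk's subject is how photon shot noise and pixel bandwidth modify that ideal limit.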

Wednesday, April 24, 2024

A review of event cameras for automotive applications

Event Cameras in Automotive Sensing: A Review
Shariff et al.
IEEE Access

DOI: https://doi.org/10.1109/ACCESS.2024.3386032

Abstract:
Event cameras (ECs) represent a paradigm shift and are emerging as valuable tools in the automotive industry, particularly for in-cabin and out-of-cabin monitoring. These cameras capture pixel intensity changes as "events" with ultra-low latency, making them suitable for real-time applications. In the context of in-cabin monitoring, ECs offer solutions for driver and passenger tracking, enhancing safety and comfort. For out-of-cabin monitoring, they excel in tracking objects and detecting potential hazards on the road. This article explores the applications, benefits, and challenges of event cameras in these two critical domains within the automotive industry. This review also highlights relevant datasets and methodologies, enabling researchers to make informed decisions tailored to their specific vehicular technology and to place their work in the broader context of EC sensing. Through an exploration of the hardware, the complexities of data processing, and customized algorithms for both in-cabin and out-of-cabin surveillance, this paper outlines a framework encompassing methodologies, tools, and datasets critical for the implementation of event camera sensing in automotive systems.

Two more Sony job openings - Belgium, this time

Sony Depthsensing Solutions      Brussels, Belgium

Analog Design Manager     Link

IC Validation and Verification Engineer     Link

Tuesday, April 23, 2024

TriEye and Vertilas 1.3μm VCSEL-Driven SWIR Sensing Solutions

TriEye and Vertilas Partner to Demonstrate 1.3μm VCSEL-Driven SWIR Sensing Solutions

TEL AVIV, Israel, April 16, 2024/ – TriEye, pioneer of the world's first cost-effective mass-market Short-Wave Infrared (SWIR) sensing technology, and Vertilas GmbH, a leader in InP VCSEL products, announced today the joint demonstration of a 1.3μm VCSEL-powered SWIR sensing system.

TriEye and Vertilas announce their collaboration in advanced imaging technology. This partnership has led to the development of a technology demonstrator that integrates TriEye's state-of-the-art Short-Wave Infrared (SWIR) Raven image sensor with Vertilas’ innovative Indium Phosphide (InP) Vertical-Cavity Surface-Emitting Laser (VCSEL) technology. Adopting high-volume, scalable manufacturing strategies, these technologies provide cost-effective solutions for both consumer and industrial markets.

The system highlights the capabilities of TriEye's CMOS-based SWIR sensor, noted for its high sensitivity and 1.3MP resolution. Designed to enhance imaging in various industries, including automotive, consumer, biometrics, and mobile robots, this solution represents a significant step forward in sensing technology. Alongside, Vertilas introduces its InP SWIR VCSEL technology that provides high output power with high power efficiency. This new VCSEL technology is a complementary innovation that enhances the SWIR camera's functionality. Deploying 1.3μm VCSEL arrays enables greatly improved eye safety and signal quality while minimizing sunlight distortion. Vertilas InP VCSEL array technology also offers wavelengths from 1.55μm up to 2μm. This new technology is expected to broaden the scope of applications in imaging and illumination across multiple industries.

"Vertilas is thrilled to expand our efforts with TriEye in this groundbreaking initiative. Our InP VCSEL technology, combined with TriEye's exceptional SWIR sensor, marks a significant advancement in the realm of imaging and illumination solutions”, said Christian Neumeyr, CEO at Vertilas. “This collaboration is more than just a technological achievement; it represents our shared vision of innovating for a better, more efficient future in both consumer and industrial applications."

"At TriEye, our commitment has always been to bring revolutionary SWIR technology to the forefront of the market. The integration of our SWIR sensor with Vertilas InP VCSEL technology in this collaborative venture is a testament to this mission”, said Avi Bakal, CEO of TriEye. “We are proud to unveil a solution that not only enhances imaging capabilities across various industries but also does so in a cost-effective and scalable manner, making advanced sensing technology more accessible than ever."

Monday, April 22, 2024

Camera identification from retroreflection signatures

In a recent article in Optics Express titled "Watching the watchers: camera identification and characterization using retro-reflections," Seets et al. from the University of Wisconsin-Madison write:

A focused imaging system such as a camera will reflect light directly back at a light source in a retro-reflection (RR), or cat-eye reflection. RRs provide a signal that is largely independent of distance, providing a way to probe cameras at very long ranges. We find that RRs provide a rich source of information on a target camera that can be used for a variety of remote sensing tasks to characterize it, including predictions of rotation and camera focusing depth as well as cell phone model classification. We capture three RR datasets to explore these problems with both large commercial lenses and a variety of cell phones. We then train machine learning models that take an RR as input and predict different parameters of the target camera. Our work has applications as an input device, in privacy protection, identification, and image validation.

Link: https://opg.optica.org/oe/fulltext.cfm?uri=oe-32-8-13836&id=548474