
posted by Fnord666 on Tuesday February 11 2020, @11:56AM
from the nothing-quite-like-first-light dept.

Arthur T Knackerbracket has found the following story:

The tension was high. In front of a large screen at the house near Madrid where members of the Consortium participating in the commissioning of the satellite live, as well as at the other institutes involved in CHEOPS, the team waited for the first images from the space telescope. "The first images that were about to appear on the screen were crucial for us to be able to determine if the telescope's optics had survived the rocket launch in good shape," explains Willy Benz, Professor of Astrophysics at the University of Bern and Principal Investigator of the CHEOPS mission. "When the first images of a field of stars appeared on the screen, it was immediately clear to everyone that we did indeed have a working telescope," says Benz happily. Now the remaining question is how well it is working.

Preliminary analysis has shown that the images from CHEOPS are even better than expected. However, better for CHEOPS does not mean sharper, as the telescope has been deliberately defocused: spreading the light over many pixels ensures that the spacecraft's jitter and the pixel-to-pixel variations are smoothed out, allowing for better photometric precision.
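To see why spreading the light helps, here is a toy simulation (nothing to do with the actual CHEOPS pipeline; the 1% flat-field error, 0.3-pixel jitter and PSF widths are invented numbers): with a sharp PSF the star's light lands on only a few pixels, so pointing jitter drags it across pixel-to-pixel gain differences and the measured flux wanders, while a broad PSF averages those differences away.

    # Toy simulation (not CHEOPS flight or pipeline code): why a deliberately
    # broadened PSF gives steadier photometry when the detector has small
    # pixel-to-pixel gain differences and the pointing jitters between frames.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 64                                              # detector size in pixels
    gain = 1.0 + 0.01 * rng.standard_normal((n, n))     # 1% pixel-to-pixel gain map

    def psf(cx, cy, sigma):
        """Unit-flux Gaussian PSF of width sigma (pixels) centred on (cx, cy)."""
        y, x = np.mgrid[0:n, 0:n]
        img = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
        return img / img.sum()

    def scatter_ppm(sigma, n_frames=500, jitter=0.3):
        """Frame-to-frame photometric scatter for a constant star, in ppm."""
        flux = []
        for _ in range(n_frames):
            cx, cy = 32 + jitter * rng.standard_normal(2)   # jittered star position
            flux.append(np.sum(psf(cx, cy, sigma) * gain))  # flux seen through the gain map
        flux = np.array(flux)
        return 1e6 * flux.std() / flux.mean()

    print(f"sharp PSF (sigma = 0.7 px):     {scatter_ppm(0.7):6.0f} ppm")
    print(f"defocused PSF (sigma = 5.0 px): {scatter_ppm(5.0):6.0f} ppm")
    # The defocused case shows far less frame-to-frame scatter from the same jitter.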

"The good news is that the actual blurred images received are smoother and more symmetrical than what we expected from measurements performed in the laboratory," says Benz. High precision is necessary for CHEOPS to observe small changes in the brightness of stars outside our solar system caused by the transit of an exoplanet in front of the star. Since these changes in brightness are proportional to the surface of the transit planet, CHEOPS will be able to measure the size of the planets. "These initial promising analyses are a great relief and also a boost for the team," continues Benz.

How well CHEOPS is working will be tested further over the next two months. "We will analyze many more images in detail to determine the exact level of accuracy that can be achieved by CHEOPS in the different aspects of the science program," says David Ehrenreich, CHEOPS project scientist at the University of Geneva. "The results so far bode well," says Ehrenreich.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Informative) by nitehawk214 on Tuesday February 11 2020, @09:40PM (2 children)

    by nitehawk214 (1304) on Tuesday February 11 2020, @09:40PM (#956978)

    It is a lot more complicated than that. Using a higher resolution camera doesn't necessarily mean higher resolution images. By taking lots of pictures and doing math, you can enhance the useful resolution.

    Scientists working on Hubble invented DRIZZLE [wikipedia.org] to cope with this (a simplified sketch of the idea follows after this comment). The final image has a higher resolution than the actual camera. It can't be higher than the angular resolution of the telescope itself, however.

    Also, CHEOPS isn't up there to take pictures; it is searching for planets using the transit technique. It's the dimming of the star that matters, not its angular resolution. Stars are pinpoint sources, and defocusing them across multiple pixels lets them average out the noise and ultimately get a better signal.

    --
    "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
  • (Score: 2) by FatPhil on Wednesday February 12 2020, @09:38AM (1 child)

    by FatPhil (863) <pc-soylentNO@SPAMasdf.fi> on Wednesday February 12 2020, @09:38AM (#957144) Homepage
    Yes, and defocussing can be done in DSP rather than mechanically; the technique has been around for many decades. But if the defocus is applied mechanically rather than in DSP, you can't later choose a different amount of defocus. It simply sounds like they didn't have a DSP engineer on the team, only astronomers.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 1, Insightful) by Anonymous Coward on Wednesday February 12 2020, @03:17PM

      by Anonymous Coward on Wednesday February 12 2020, @03:17PM (#957200)

      I suggest you work up an error budget and get smart on interpixel and sampling noise. It is completely asinine to suggest getting around pixel and electronic noise with a DSP. Putting aside the effectiveness (or lack thereof) of reducing your noise that way, why the hell would you want to take on a whole DSP and its code development, with all the preflight hardware and software development, certification and testing, when you can just put in a slight defocus and get better results?

      Based on this and some of your other comments, I really don't think you understand why they are putting defocus there in the first place.