Arthur T Knackerbracket has found the following story:
The tension was high: in front of a large screen at the house near Madrid where members of the Consortium participating in the commissioning of the satellite live, and at the other institutes involved in CHEOPS, the team waited for the first images from the space telescope. "The first images that were about to appear on the screen were crucial for us to be able to determine if the telescope's optics had survived the rocket launch in good shape," explains Willy Benz, Professor of Astrophysics at the University of Bern and Principal Investigator of the CHEOPS mission. "When the first images of a field of stars appeared on the screen, it was immediately clear to everyone that we did indeed have a working telescope," says Benz happily. The remaining question is how well it is working.
Preliminary analysis has shown that the images from CHEOPS are even better than expected. However, better for CHEOPS does not mean sharper as the telescope has been deliberately defocused. This is because spreading the light over many pixels ensures that the spacecraft's jitter and the pixel-to-pixel variations are smoothed out, allowing for better photometric precision.
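The defocusing rationale can be sketched numerically. Below is a minimal Python toy model (all numbers are invented for illustration and are not CHEOPS figures): a star's flux is summed over detector pixels whose gains differ by about 1%, while pointing jitter moves the PSF centre between frames. A wide PSF averages over many gain errors, so the total flux comes out steadier frame to frame.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of the defocus trick (illustrative numbers only):
# a 32x32 patch of detector pixels with ~1% uncorrelated gain errors.
N = 32
gains = 1.0 + 0.01 * rng.standard_normal((N, N))
y, x = np.mgrid[0:N, 0:N]

def frame_flux(sigma, jitter_px, n_frames=500):
    """Measured flux per frame for a Gaussian PSF of width `sigma` pixels
    whose centre jitters by `jitter_px` (RMS) from frame to frame."""
    fluxes = []
    for _ in range(n_frames):
        cx, cy = N / 2 + jitter_px * rng.standard_normal(2)
        psf = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
        psf /= psf.sum()                # unit true flux before gain errors
        fluxes.append((psf * gains).sum())
    return np.array(fluxes)

focused = frame_flux(sigma=0.7, jitter_px=0.5)    # tight PSF: a few pixels
defocused = frame_flux(sigma=5.0, jitter_px=0.5)  # spread over hundreds of pixels

print("relative scatter, focused  :", focused.std() / focused.mean())
print("relative scatter, defocused:", defocused.std() / defocused.mean())
```

In this sketch the defocused PSF shows markedly less photometric scatter, because jitter resamples the flat-field errors of only a few pixels when the spot is tight, but averages over hundreds when it is spread out.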
"The good news is that the actual blurred images received are smoother and more symmetrical than what we expected from measurements performed in the laboratory," says Benz. High precision is necessary for CHEOPS to observe small changes in the brightness of stars outside our solar system caused by the transit of an exoplanet in front of the star. Since these changes in brightness are proportional to the area of the transiting planet's disc, CHEOPS will be able to measure the size of the planets. "These initial promising analyses are a great relief and also a boost for the team," continues Benz.
How well CHEOPS is working will be tested further over the next two months. "We will analyze many more images in detail to determine the exact level of accuracy that can be achieved by CHEOPS in the different aspects of the science program," says David Ehrenreich, CHEOPS project scientist at the University of Geneva. "The results so far bode well," said Ehrenreich.
(Score: 2) by takyon on Tuesday February 11 2020, @01:23PM (11 children)
I didn't know about the blurriness aspect. That's a neat trick.
(Score: 3, Informative) by FatPhil on Tuesday February 11 2020, @02:05PM (10 children)
Defocussing is a horribly unpleasant thing to undo; it's pretty much the least reversible operation you can perform on data (the inverse of the blur kernel can blow up to infinity), so it can be considered an information-losing operation.
So why would you do that - deliberately throw away information at source?
If you want blurred data - get the cleanest data you can, and then apply precisely the blur you want to that clean data. Who knows - you might change the level of blur you want - you can only do that if you can go back to the raw data.
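The "inverse can go to infinity" point in the comment above is easy to demonstrate. Here is a toy 1-D Python sketch (the signal, blur width, and noise level are all invented for illustration): a Gaussian blur's transfer function decays essentially to zero at high spatial frequencies, so a naive Fourier-domain inverse filter amplifies even tiny noise into garbage.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1-D toy: blur a signal with a Gaussian kernel, add a little noise,
# then try to undo the blur by dividing in the Fourier domain.
n = 256
x = np.zeros(n)
x[60], x[61], x[130] = 1.0, 0.8, 0.5          # a few point sources

freq = np.fft.fftfreq(n)
sigma = 4.0                                    # blur width in samples
H = np.exp(-2 * (np.pi * freq * sigma) ** 2)   # Gaussian blur transfer function

blurred = np.fft.ifft(np.fft.fft(x) * H).real
noisy = blurred + 1e-6 * rng.standard_normal(n)    # tiny additive noise

naive = np.fft.ifft(np.fft.fft(noisy) / H).real    # naive inverse filter

print("smallest |H|:", np.abs(H).min())        # vanishingly small
print("max |error| after 'deblurring':", np.abs(naive - x).max())
```

Dividing by transfer-function values of order 1e-35 turns 1e-6 noise into astronomically large artefacts, which is why practical deconvolution needs regularisation (e.g. Wiener filtering) and is never a free lunch.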
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 2) by FatPhil on Tuesday February 11 2020, @02:21PM (2 children)
(Score: 1, Insightful) by Anonymous Coward on Tuesday February 11 2020, @03:13PM (1 child)
If someone submits a paper where they are using Photoshop to do photometry, I would move to have them scientifically disbarred. You can keep your backyard astronomy tricks to make pretty pictures. If I'm trying to really figure out the number of photons I'm capturing, I want my point spread function spread over multiple pixels.
And the Internet is filled with all sorts of imagery crap of people taking Mars rover images and cycling through Photoshop zoom and sharpen buttons and seeing all sorts of interesting effects. Photoshop will happily keep interpolating and resampling all the way down past the point where your original pixel is larger than your image.
(Score: 2) by Osamabobama on Tuesday February 11 2020, @07:41PM
Headline: Amateur xenoarchaeologist finds amazing artifacts in Mars image
(Score: 2, Insightful) by Anonymous Coward on Tuesday February 11 2020, @02:37PM (1 child)
> I think it's broken.
Hmmm, who do I believe? You (FP), shooting from the hip, or the scientists and engineers who designed this space telescope for their very specific purpose?
Hint -- star photography/photometry is not like normal/terrestrial photography.
(Score: 2) by FatPhil on Wednesday February 12 2020, @09:41AM
(Score: 2, Informative) by Anonymous Coward on Tuesday February 11 2020, @03:26PM (1 child)
They do not intend to undo the blur. It is there by design. They are not looking for high spatial frequency information, they are looking for photometric changes over long observational times. The expected noise from a spot moving back and forth over multiple pixels due to spacecraft jitter is larger than smearing the signal over a larger number of pixels and having the intensity distribution change a bit, but to largely stay on the same pixels.
If you want more information, check out sections 4.4.3 (Defocused PSFs performance) and 4.6 (Noise Budget) [esa.int].
(Score: 2) by FatPhil on Wednesday February 12 2020, @09:40AM
(Score: 2, Informative) by nitehawk214 on Tuesday February 11 2020, @09:40PM (2 children)
It is a lot more complicated than that. Using a higher resolution camera doesn't necessarily mean higher resolution images. By taking lots of pictures and doing math, you can enhance the useful resolution.
Scientists working on Hubble invented DRIZZLE [wikipedia.org] to cope with this. The final image has a higher resolution than the actual camera. It can't be higher than the angular resolution of the telescope itself, however.
Also, CHEOPS isn't up there to take pictures; it is searching for planets using the transit technique. It's the dimming of the star that matters, not its angular resolution. Stars are pinpoint sources; defocusing them onto multiple pixels lets them average out the noise and ultimately get a better signal.
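The Drizzle idea can be caricatured in a few lines of Python. This is a 1-D toy in the spirit of the technique, not the real STScI algorithm, and all sizes and positions are invented: exposures binned into coarse detector pixels at different sub-pixel dithers are redistributed onto a common fine grid, which pins down the source position far better than any single binned frame can.

```python
import numpy as np

# Toy 1-D shift-and-add in the spirit of Drizzle (not the real STScI
# algorithm; all sizes here are invented for illustration).
F, P = 512, 8                     # fine-grid samples; 1 detector pixel = 8 samples
scene = np.exp(-0.5 * ((np.arange(F) - 250.0) / 3.0) ** 2)   # narrow source

acc = np.zeros(F)                 # flux accumulated on the fine grid
wts = np.zeros(F)                 # how many exposures covered each position

for d in range(P):                # one exposure per sub-pixel dither offset
    usable = (F - d) // P * P
    frame = scene[d:d + usable].reshape(-1, P).sum(axis=1)   # detector binning
    # spread each pixel's flux uniformly back over its fine-grid footprint
    acc[d:d + usable] += np.repeat(frame / P, P)
    wts[d:d + usable] += 1.0

recon = acc / np.maximum(wts, 1.0)

# a single undithered frame, expanded back to the fine grid for comparison
one_frame = np.repeat(scene.reshape(-1, P).sum(axis=1) / P, P)

print("true peak position   :", int(scene.argmax()))      # 250
print("single binned frame  :", int(one_frame.argmax()))  # snaps to a pixel edge
print("dithered combination :", int(recon.argmax()))      # recovers 250
```

The real algorithm weights input pixels by their overlap with output pixels (with a shrunken "pixfrac"), and, as the comment notes, it recovers sampling, not the telescope's diffraction limit.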
"Don't you ever miss the days when you used to be nostalgic?" -Loiosh
(Score: 2) by FatPhil on Wednesday February 12 2020, @09:38AM (1 child)
(Score: 1, Insightful) by Anonymous Coward on Wednesday February 12 2020, @03:17PM
I suggest you work up an error budget and get smart on interpixel and sampling noise. It is completely asinine to suggest getting around pixel and electronic noise by using a DSP. Putting aside the effectiveness (or lack thereof) of reducing your noise, why the hell would you want to do a whole DSP and code development effort, with all the preflight hardware and software development, certification, and testing, when you can just put in a slight defocus and get better results?
Based on this and some of your other comments, I really don't think you understand why they are putting defocus there in the first place.