
posted by hubie on Wednesday September 21 2022, @08:35AM
from the buh-bye-night-sky dept.

The imminent launch of a BlueWalker satellite, with a giant phased array antenna, portends a brightening night sky:

The prototype of a new constellation of extremely bright Earth-orbiting satellites is due to launch in early- to mid-September. The AST SpaceMobile company plans to orbit more than 100 of these spacecraft by the end of 2024. Astronomers at the Vera Rubin Observatory and the International Astronomical Union's Centre for the Protection of Dark and Quiet Skies from Satellite Constellation Interference (IAU CPS) are concerned because these new spacecraft will interfere with celestial observations, adding to the problems already caused by other constellations.

The first member of this new group, called BlueWalker 3, will feature a giant antenna array covering an area of 64 square meters (689 square feet). Observers on the ground will see bright sunlight reflected from this structure. After on-orbit tests of BlueWalker 3 are completed, the operational satellites, called BlueBirds, will be launched. BlueBirds may produce even more glaring light pollution since they are significantly larger. The commercial appeal of these satellites is that they will link directly to cell phones without the need for a cell tower. AST SpaceMobile has already secured a license from the Federal Communications Commission to test the prototype.

[...] Other bright satellites are waiting in the wings: 30,000 second-generation Starlink satellites are currently awaiting FCC approval. Like the BlueBirds, the new Starlinks may carry antennas for direct connection to cell phones; the antennas are slightly smaller at "only" 25 square meters, but the satellites would be far more numerous than the BlueBird constellation. That development would be very bad news for astronomy.

BlueWalker 3 is expected to be among the brightest objects in the night sky after the antenna unfolds. Amateur astronomers can help record this satellite's brightness, bringing awareness to bright satellites' effects on our night sky and on astronomy.

[...] Astrophotographers can also play an important role in the study of artificial satellites by uploading celestial images impacted by satellite streaks to the Trailblazer website. Meredith Rawls and Dino Bektešević (both at University of Washington) are developing this data archive as part of the IAU's response to the problems posed by spacecraft. Trailblazer stores the impacted images and records selected metadata, so users can search for satellite-streaked images by date, location, and other parameters such as sky position and telescope.

See also:
    AST SpaceMobile video describing the phased array satellite.
    NASA APOD showing satellite streaks over a two hour period.

Previously:
    SpaceX Has Had 'Promising Conversations' With Apple About iPhone Satellite Service
    AST SpaceMobile Gets US Approval to Test Satellite-based Cellular Broadband


Original Submission

 
  • (Score: 4, Informative) by Immerman on Wednesday September 21 2022, @02:24PM (5 children)

    by Immerman (3985) on Wednesday September 21 2022, @02:24PM (#1272768)

    You seem to be under the impression that long exposures are only necessary for chemical film?

    That's hardly true - you need long exposures for digital sensors as well. Digital sensors suffer from noise - like light static overlaid on the image. It's obvious in photos from cheap cameras, and even with a good camera it will be obvious in a photo taken in a pitch-black room. And the higher the pixel density on the sensor, the worse the noise.

    If you're imaging a dim object (and that's pretty much everything that hasn't been studied to death already) you need to collect light for an extended period for the tiny amount of light hitting a pixel on the sensor to accumulate to the point that it substantially exceeds the noise floor. Video won't cut it - by reading the sensor you reset the accumulated data, and no amount of multi-frame processing will make up for throwing away your data before it accumulates enough to stand clear of the noise.
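
    To put rough numbers on the readout penalty (a toy sketch, all values invented): shot noise is the same either way, but read noise is paid once per readout, so chopping one long exposure into N short frames multiplies its variance by N.

    import numpy as np

    flux = 0.05        # photons/s onto one pixel from a faint source (made up)
    read_noise = 5.0   # electrons RMS added at every readout (made up)
    t_total = 600.0    # total integration time, seconds
    n_short = 60       # number of short frames if we read out every 10 s

    signal = flux * t_total                # 30 photons collected either way

    # One long exposure: shot noise plus a single dose of read noise.
    snr_long = signal / np.sqrt(signal + read_noise**2)

    # Sixty stacked short frames: same total shot noise, but read noise
    # is added at every readout, so its variance grows by n_short.
    snr_stack = signal / np.sqrt(signal + n_short * read_noise**2)

    print(f"SNR, one 600 s exposure: {snr_long:.2f}")   # ~4.1
    print(f"SNR, 60 x 10 s, stacked: {snr_stack:.2f}")  # ~0.8

    With these made-up numbers the faint source clears the noise in one long exposure but drowns in accumulated read noise when the same total time is chopped into sixty reads.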

  • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @03:56PM

    by JoeMerchant (3937) on Wednesday September 21 2022, @03:56PM (#1272795)

    The question is: what is the optimal exposure time?

    Long exposures for CMOS sensors might be 10 seconds, and if you get a thin streak you don't necessarily have to throw out the whole frame.

    In practice, high-investment images these days are composed of thousands of exposures - not 1/60th of a second each, to be sure, but thousands nonetheless.

    These images full of streaks intentionally emphasize the issue without applying any mitigation.
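
    For what that mitigation can look like: a common approach (a sketch of standard sigma-clipped stacking, not any particular pipeline) masks pixels that jump far from their per-pixel median across the stack - such as a one-frame streak - before averaging:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stack: 100 frames of sky around level 100, one frame streaked.
    n_frames, size = 100, 32
    stack = rng.normal(100.0, 5.0, (n_frames, size, size))
    stack[17, 10, :] += 5000.0    # bright streak across row 10 of frame 17

    def sigma_clipped_mean(frames, kappa=3.0):
        """Mask samples more than kappa*sigma from the per-pixel median, then average."""
        center = np.median(frames, axis=0)
        spread = frames.std(axis=0)
        keep = np.abs(frames - center) < kappa * spread
        return (frames * keep).sum(axis=0) / keep.sum(axis=0)

    naive = stack.mean(axis=0)
    clipped = sigma_clipped_mean(stack)
    print(f"streaked row, naive mean:   {naive[10].max():.1f}")    # dragged up ~50 counts
    print(f"streaked row, clipped mean: {clipped[10].max():.1f}")  # back near 100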

  • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @05:08PM (3 children)

    by JoeMerchant (3937) on Wednesday September 21 2022, @05:08PM (#1272813)

    >no amount of multi-frame processing will make up for throwing away your data before it accumulates enough to stand clear of the noise.

    Actually, you can take 10,000 frames which all appear to be nothing but random noise, average them, and improve the signal-to-noise ratio by: can you guess? 10,000:1.

    If you're looking for a (repetitive) cardiac signal on the body, you can do some virtually magical signal processing by ensemble averaging - triggered on the R wave from the EKG. Same thing with stars in the sky: they're constant. Even if their signal is only 1% of your noise floor, capture 10,000 noisy images in which their +0.01 strength signal is superimposed over that +/- 1.0 strength noise, and the average of those 10,000 images will bring your star out 100x brighter than the noise, while the darker sky between the stars will come up dark in the average of the noise (or, if your sensor is biased, you'll get a near-uniform background value which you can then subtract out...)

    Do satellite flares diminish image quality? Yes, ever-so-slightly, even when removed by clever editing techniques. Can you even notice the lower image quality in a properly processed final image? Nope. The headline photos showing satellite streaks are emphasizing the problem, not showing its true impact.

    • (Score: 1, Informative) by Anonymous Coward on Wednesday September 21 2022, @05:48PM (2 children)

      by Anonymous Coward on Wednesday September 21 2022, @05:48PM (#1272823)

      Beating down the noise for independent and identically distributed (IID) samples goes as the square root of the number of samples, so your 10,000 frames would (in principle) knock down your noise by a factor of 100, not 10,000. However, frame averaging only gets you so far once you consider the other sensor noise sources, such as read noise, fixed-pattern noise, etc. You quickly break the IID assumption when you start changing your background signal, because now you're getting variable shot noise thrown in. There are ways to approach all of these things, but each correction leaves an associated residual error, and those residuals add in quadrature. Assuming you can do a decent job of co-aligning all of your frames, racking and stacking them will certainly look much better, but you'll find the useful bottom end of your signal is going to be much higher than you might think it should be.
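
      To put numbers on that square-root scaling, here's a quick check using the parent comment's star at 1% of the noise floor (a toy simulation, all values invented):

      import numpy as np

      rng = np.random.default_rng(1)

      star = 0.01    # signal at 1% of the noise floor, per the parent comment
      sigma = 1.0    # per-frame noise, IID across frames
      for n in (100, 10_000):
          mean = (star + rng.normal(0.0, sigma, n)).mean()
          # Noise on the mean shrinks as sigma/sqrt(n), not sigma/n.
          print(f"n={n:>6}: measured={mean:+.4f}, noise on mean={sigma/np.sqrt(n):.4f}")

      At n = 10,000 the residual noise on the mean is 0.01 - the same size as the star - so the averaging buys a factor of 100 and leaves the 1% source sitting right at the noise, not 100x above it.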

      Everything mentioned so far is great for single-band imagery, but if you are using a filter set, you now have the issue that the streak will land in a different place for each filter color, which further complicates extracting multi-spectral information, since you are assuming your combined color image is representative of the scene.

      • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @08:17PM (1 child)

        by JoeMerchant (3937) on Wednesday September 21 2022, @08:17PM (#1272862)

        All good points, and my reference for ensemble averaging is actually in the area of time-series data containing cardiac ventricular volumes buried underneath respiratory motion. We would average about 60 heartbeats' worth of signal, and the cardiac signal would indeed grow to 60x its previous size relative to the (more or less) uncorrelated respiratory signal. There are actually correlations between when your heart beats and your respiratory cycle, but the correlations weren't strong enough to show up in an average of 60 heartbeats.

        We tried all kinds of crazy signal processing ideas over the years; most were pretty meh when applied to real-world data, but ensemble averaging exceeded expectations every time we found an application for it - really dramatically good results, with the inherent limitations of the method: your signal has to be repetitive, the noise at least mostly uncorrelated, what you're getting isn't an instantaneous read but a read of the averaged samples, etc.
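
        For readers who haven't seen it, a minimal sketch of triggered ensemble averaging (all parameters invented, and the "respiratory" interference crudely modeled as uncorrelated noise, which as noted above is only approximately true):

        import numpy as np

        rng = np.random.default_rng(2)

        fs, beats = 250, 60            # sample rate (Hz) and beat count, made up
        t = np.arange(fs) / fs         # pretend every beat lasts exactly 1 s
        template = 0.2 * np.exp(-(t - 0.3)**2 / 0.002)   # small repeating "cardiac" bump

        # Sixty beats, each buried in much larger uncorrelated interference.
        trials = template + rng.normal(0.0, 1.0, (beats, t.size))

        # Ensemble average: align on the (here, known) trigger and average.
        ensemble = trials.mean(axis=0)

        residual = (ensemble - template).std()
        print(f"noise before: 1.000, after: {residual:.3f} "
              f"(~1/sqrt({beats}) = {1/np.sqrt(beats):.3f})")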

        • (Score: 0) by Anonymous Coward on Wednesday September 21 2022, @10:26PM

          by Anonymous Coward on Wednesday September 21 2022, @10:26PM (#1272903)

          That sounds like something akin to a lock-in amplifier technique, where you're using the heartbeat as your oscillator and contributions from anything not correlated with your oscillator attenuate out. It is a wonderful technique, but it doesn't seem to be very well known even amongst my colleagues who work in labs. There are some techniques like this where I understand the math behind them, and have even worked through the derivations, but seeing them off the paper and working in a lab still feels a bit like magic. Especially the going back and forth between the time and frequency domains (or pixel and spatial-frequency domains).
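
          For the curious, the core of the lock-in idea fits in a few lines (a toy sketch, all parameters invented): multiply the noisy input by in-phase and quadrature copies of the reference, then average, and anything uncorrelated with the reference washes out.

          import numpy as np

          rng = np.random.default_rng(3)

          fs, dur = 10_000, 10.0        # sample rate (Hz) and record length (s)
          t = np.arange(int(fs * dur)) / fs
          f_ref = 137.0                 # reference oscillator frequency, Hz

          # A 0.05-amplitude tone at f_ref buried under unit-variance noise.
          x = 0.05 * np.sin(2 * np.pi * f_ref * t) + rng.normal(0.0, 1.0, t.size)

          # Demodulate: correlate with sin and cos references, then low-pass
          # (here, just a long mean over many full cycles of the reference).
          i = 2 * (x * np.sin(2 * np.pi * f_ref * t)).mean()
          q = 2 * (x * np.cos(2 * np.pi * f_ref * t)).mean()
          print(f"recovered amplitude: {np.hypot(i, q):.3f} (true: 0.050)")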