
SoylentNews is people

posted by hubie on Wednesday September 21 2022, @08:35AM
from the buh-bye-night-sky dept.

The imminent launch of a BlueWalker satellite, with a giant phased array antenna, portends a brightening night sky:

The prototype of a new constellation of extremely bright Earth-orbiting satellites is due to launch in early- to mid-September. The AST SpaceMobile company plans to orbit more than 100 of these spacecraft by the end of 2024. Astronomers at the Vera Rubin Observatory and the International Astronomical Union's Centre for the Protection of Dark and Quiet Skies from Satellite Constellation Interference (IAU CPS) are concerned because these new spacecraft will interfere with celestial observations, adding to the problems already caused by other constellations.

The first member of this new group, called BlueWalker 3, will feature a giant antenna array covering an area of 64 square meters (689 square feet). Observers on the ground will see bright sunlight reflected from this structure. After on-orbit tests of BlueWalker 3 are completed, the operational satellites, called BlueBirds, will be launched. BlueBirds may produce even more glaring light pollution since they are significantly larger. The commercial appeal of these satellites is that they will link directly to cell phones without the need for a cell tower. AST SpaceMobile has already secured a license from the Federal Communications Commission to test the prototype.

[...] Other bright satellites are waiting in the wings: 30,000 second-generation Starlink satellites are currently awaiting FCC approval. Like the BlueBirds, the new Starlinks may carry antennas for direct connection to cell phones; the antennas are slightly smaller at "only" 25 square meters, but the satellites would be far more numerous than the BlueBird constellation. That development would be very bad news for astronomy.

BlueWalker 3 is expected to be among the brightest objects in the night sky after the antenna unfolds. Amateur astronomers can help record this satellite's brightness, bringing awareness to bright satellites' effects on our night sky and on astronomy.

[...] Astrophotographers can also play an important role in the study of artificial satellites, by uploading celestial images impacted by satellite streaks to the Trailblazer website. Meredith Rawls and Dino Bektešević (both at University of Washington) are developing this data archive as part of the IAU's response to the problems posed by spacecraft. Trailblazer stores the impacted images and records selected metadata, so users can search for satellite-streaked images by date, location, and other parameters such as sky position and telescope.

See also:
    AST SpaceMobile video describing the phased array satellite.
    NASA APOD showing satellite streaks over a two-hour period.

Previously:
    SpaceX Has Had 'Promising Conversations' With Apple About iPhone Satellite Service
    AST SpaceMobile Gets US Approval to Test Satellite-based Cellular Broadband


Original Submission

 
This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Informative) by JoeMerchant on Wednesday September 21 2022, @10:49AM (25 children)

    by JoeMerchant (3937) on Wednesday September 21 2022, @10:49AM (#1272718)

    I have some sad news for anyone who is all set to get outraged about more satellites in the sky... Serious astronomy already needs (and has) digital processing solutions for satellite streak removal. It's about a million times easier to remove a satellite streak from a multiple exposure than it is to compensate for atmospheric distortions, and they have been doing atmospheric distortion compensation for decades.

    If your backyard Tasco-into-your-cellphone-camera setup doesn't have satellite streak removal in the software stack, my condolences; iOS is a bitch to develop for.

    If you are old school and do long exposures on chemical film, A) you are a rare bird indeed, and B) your observation window has been reduced to those times of night when LEO is in the Earth's shadow.

    With digital processing: ID the affected pixels, which is insanely easy to do with an automatic algorithm, and remove them from the average, which is the kind of math they teach in the 5th grade...
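As a concrete illustration of that idea, here is a minimal sigma-clipped-mean sketch on synthetic frames (not any particular stacking package's implementation): a pixel crossed by a streak is lit in only one or two frames, so along the stack axis it is a huge outlier and gets dropped before averaging.

```python
import numpy as np

def sigma_clip_stack(frames, kappa=3.0):
    """Average aligned frames, rejecting per-pixel outliers before the mean.

    A satellite streak lights a given pixel in only one or two frames, so
    along the stack axis it is a huge outlier and gets dropped; the star
    field, present in every frame, survives. `frames` is (N, H, W).
    """
    stack = np.asarray(frames, dtype=float)
    center = np.median(stack, axis=0)
    spread = np.std(stack, axis=0) + 1e-12           # avoid divide-by-zero
    keep = np.abs(stack - center) <= kappa * spread  # True = use this sample
    return np.sum(stack * keep, axis=0) / np.maximum(keep.sum(axis=0), 1)

# Ten synthetic "sky" frames; one frame carries a bright streak on row 16.
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 1.0, size=(10, 32, 32))
frames[4, 16, :] += 5000.0
clean = sigma_clip_stack(frames)   # streak rejected, row 16 back near 100
naive = frames.mean(axis=0)        # streak drags row 16 far above the sky
```

A plain mean leaves the streaked row hundreds of counts too bright; the clipped mean recovers the sky value because the outlier sample is excluded per pixel.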

    If you need to get some rage out about light pollution, try your local terrestrial sources, starting with any you may leave on overnight outside your home.

    --
    🌻🌻 [google.com]
  • (Score: 4, Insightful) by maxwell demon on Wednesday September 21 2022, @11:20AM (7 children)

    by maxwell demon (1608) on Wednesday September 21 2022, @11:20AM (#1272725) Journal

    Yeah, sure, all those professional astronomers who complain have absolutely no clue how to do astronomy.</sarcasm>

    --
    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2, Interesting) by khallow on Wednesday September 21 2022, @11:38AM (1 child)

      by khallow (3766) Subscriber Badge on Wednesday September 21 2022, @11:38AM (#1272727) Journal
      Wouldn't be the first time. A glaring example of this is an institutional emphasis on R&D for spacecraft, but not for doing actual science with those spacecraft. That would require some basic engineering economics.

      It's also possibly a move by SpaceX competitors to undermine Starlink and similar efforts. The complaints could be legit, but the visibility of those complaints may be getting some help.
      • (Score: 3, Touché) by JoeMerchant on Wednesday September 21 2022, @01:29PM

        by JoeMerchant (3937) on Wednesday September 21 2022, @01:29PM (#1272751)

        >the TrailBlazer website. Meredith Rawls and Dino Bektešević (both at University of Washington) are developing this data archive as part of the IAU's response to the problems posed by spacecraft.

        Check their funding, if you get the real answers I bet the connections are crystal clear.

        --
        🌻🌻 [google.com]
    • (Score: 3, Interesting) by JoeMerchant on Wednesday September 21 2022, @12:21PM (4 children)

      by JoeMerchant (3937) on Wednesday September 21 2022, @12:21PM (#1272736)

      Ragers gonna rage. I find that it's very rarely a professional astronomer complaining about this; mostly it seems to be journalists fanning the flames, and people who have never processed a multiple-exposure image getting all outraged in the responses.

      Very first Google result:

      https://www.astropixelprocessor.com/community/main-forum/removing-satellite-trails/ [astropixelprocessor.com]

      Two years ago that guy had been removing satellite trails for a long time and the software update he applied obscured the method for him, for a minute.

      --
      🌻🌻 [google.com]
      • (Score: 2) by maxwell demon on Thursday September 22 2022, @08:37AM (3 children)

        by maxwell demon (1608) on Thursday September 22 2022, @08:37AM (#1272956) Journal

        I can't find any indication that this guy is a professional astronomer rather than an amateur.

        Note that professional astronomers are not simply after pretty pictures; they are after scientific data. Note also that astronomical processes can be time-dependent. If you are trying to measure a time-dependent signal, averaging over a longer time won't do you any good.

        And “Astronomers at the Vera Rubin Observatory” surely aren't journalists. Of course it's a journalist bringing you their message; that's the journalist's job.

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 2) by JoeMerchant on Thursday September 22 2022, @10:35AM (2 children)

          by JoeMerchant (3937) on Thursday September 22 2022, @10:35AM (#1272962)

          Link was intended to be to the readily available amateur software that solved the problem for amateurs years ago.

          The Vera Rubin Observatory has not taken a single image yet and won't until next year. Why wouldn't the reporter take statements from astronomers at the two working telescopes at the same location?

          Overall, considering the data from Webb and Hubble, would you consider astronomy to be getting more good data today, or in 1990 when there was less, but not zero, interference from manmade orbiting stuff?

          --
          🌻🌻 [google.com]
          • (Score: 2) by maxwell demon on Thursday September 22 2022, @12:55PM (1 child)

            by maxwell demon (1608) on Thursday September 22 2022, @12:55PM (#1272979) Journal

            And the thought never occurred to you that professional astronomers might have different goals and therefore different needs than amateur astronomers?

            Anyway, I'll just file you under Dunning-Kruger and no longer waste my time arguing with you.

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 2) by JoeMerchant on Thursday September 22 2022, @02:10PM

              by JoeMerchant (3937) on Thursday September 22 2022, @02:10PM (#1272984)

              No argument, but consider this hypothetical:

              “Astronomers at the Vera Rubin Observatory” https://www.lsst.org/ [lsst.org] got a big fat grant for construction and operation, and as opening day (first light) looms their dreams of becoming the next NdGT slowly fade as Chilean schoolchildren touring the inactive observatory yawn and requests for interviews are few, far between, and from very minor outlets. However, the new science they are supposed to be doing, which got them the grant in the first place, means that existing algorithms for dealing with satellite interference don't completely solve all their problems. So, the television interviewing skills they have been practicing are getting little use, but before they can publish that first big discovery from the new instrument (their other life's dream), they're going to have to slog through database maintenance and satellite tracking algorithms doing original work in areas mostly only useful to themselves, work that really doesn't fulfill any of their hopes or dreams in and of itself.

              So, while the frustrated astrophysicists are slogging through the unpleasant but necessary drudge, ersatz reporter contacts - nudged by interests wanting to diminish the popularity of new satellite swarm services - relay the researchers' sour grapes grumblings about these satellites, feeding the world of people eager to have something more "human scale" to hate on besides the too scary prospects of nuclear winter, climate apocalypse, AI Armageddon, etc.

              --
              🌻🌻 [google.com]
  • (Score: 1, Interesting) by Anonymous Coward on Wednesday September 21 2022, @11:40AM (10 children)

    by Anonymous Coward on Wednesday September 21 2022, @11:40AM (#1272728)

    id the affected pixels, which is insanely easy to do with an automatic algorithm

    What is this algorithm that can do this? This sounds like a similar problem to identifying cosmic ray streaks in an imager, and the last time I looked into that (which was a few years ago), this was not a trivial problem at all; or at least, clever ideas were worthy of being written up as journal articles.

    • (Score: 0) by Anonymous Coward on Wednesday September 21 2022, @11:54AM (1 child)

      by Anonymous Coward on Wednesday September 21 2022, @11:54AM (#1272733)

      This sounds like a similar problem to identifying cosmic ray streaks in an imager

      Cosmic ray streaks are random. Satellites follow a precise path.

      • (Score: 0) by Anonymous Coward on Wednesday September 21 2022, @12:26PM

        by Anonymous Coward on Wednesday September 21 2022, @12:26PM (#1272737)

        and Ray Stevens is handsom

    • (Score: 3, Informative) by JoeMerchant on Wednesday September 21 2022, @01:28PM (7 children)

      by JoeMerchant (3937) on Wednesday September 21 2022, @01:28PM (#1272750)

      >What is this algorithm that can do this?

      A bright thing that starts outside the FOV, then enters and moves across at anything near "satellite speed" (which is considerably slower than cosmic rays): erase it, the pixels close to it, and maybe its track on subsequent frames if it's bright enough to dazzle the sensor.

      You don't need to, but I bet the big boys actually have orbital schedules for known objects, know 99.999%+ of the objects that would interfere with the images they are capturing, and can remove them without even looking for bright streaks in the image - they just know where to expect them. That's a big database to maintain, but one copy of that database serves the entire community. Also, this isn't too far off from how new object discovery is done. The satellites move faster than planetesimals, asteroids, and other known objects, but it's basically the same computation.
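A toy version of the "know where to expect them" approach (all coordinates below are made up): given a predicted chord through the frame, build a boolean mask of nearby pixels to exclude before stacking, with no streak detection needed at all.

```python
import numpy as np

def mask_near_track(shape, p0, p1, radius=2.0):
    """Boolean mask of pixels within `radius` of the segment p0 -> p1.

    p0 and p1 are (row, col) endpoints of the predicted streak; the mask
    marks pixels to exclude from a frame before it goes into the stack.
    """
    h, w = shape
    rows, cols = np.mgrid[0:h, 0:w]
    pix = np.stack([rows, cols], axis=-1).astype(float)   # (h, w, 2)
    a, b = np.asarray(p0, float), np.asarray(p1, float)
    ab = b - a
    # Project every pixel onto the segment and clamp to its endpoints.
    frac = np.clip(((pix - a) @ ab) / (ab @ ab), 0.0, 1.0)
    nearest = a + frac[..., None] * ab
    return np.linalg.norm(pix - nearest, axis=-1) <= radius

# A hypothetical predicted pass crossing a 64x64 frame.
mask = mask_near_track((64, 64), (0, 10), (63, 50), radius=1.5)
```

The real complication, as discussed further down the thread, is getting the predicted endpoints accurately in the first place; the masking itself is the easy part.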

      --
      🌻🌻 [google.com]
      • (Score: 5, Informative) by Immerman on Wednesday September 21 2022, @02:26PM (5 children)

        by Immerman (3985) on Wednesday September 21 2022, @02:26PM (#1272769)

        Quick sanity check:
        From what I can find a typical amateur telescope has a field of view of about 15', or roughly 1/1,440th of a great circle.

        A typical LEO satellite has an orbital period of about 90 minutes, so it will be in frame for at most about 4 seconds - usually less, since a ground observer sees a LEO satellite sweep across the sky faster than its orbital rate. Professional telescopes tend to have a much smaller FoV (= much higher magnification), so the transit time for them will be much shorter.

        I don't do astronomy, but it seems to me a few seconds is a pretty short exposure time if you want to be able to see *anything* more than the brightest stars. So you're unlikely to get any sense of motion from a satellite - you'll just see a bright streak across the frame - and the internal reflections of that light, which is orders of magnitude brighter than everything else in the sky, will likely drown out anything else of interest in that frame.
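Redoing that back-of-envelope explicitly (this is an upper bound, since a ground observer actually sees a LEO satellite move faster than its orbital rate near zenith):

```python
# Back-of-envelope: how long does a LEO satellite stay in a 15' field of view?
ARCMIN_PER_GREAT_CIRCLE = 360 * 60            # 21,600 arcminutes in 360 degrees
fov_arcmin = 15
fov_fraction = fov_arcmin / ARCMIN_PER_GREAT_CIRCLE   # = 1/1,440
orbital_period_s = 90 * 60                    # ~5,400 s for a typical LEO orbit
transit_s = orbital_period_s * fov_fraction   # = 3.75 s upper bound
```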

        • (Score: 1, Informative) by Anonymous Coward on Wednesday September 21 2022, @02:35PM

          by Anonymous Coward on Wednesday September 21 2022, @02:35PM (#1272773)

          If you are doing multiple exposures, it would be pretty easy to dump the one (or two) frames that streak is in.

        • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @03:39PM

          by JoeMerchant (3937) on Wednesday September 21 2022, @03:39PM (#1272788)

          As AC said: think in terms of frame stacking: not a single 60-second (or 60+ minute) exposure, but a series of exposures at whatever the optimal time for the sensor is, maybe one second?

          Some people are doing multi-night captures with thousands of frames captured per night, stacked into a single image.

          --
          🌻🌻 [google.com]
        • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @06:29PM (2 children)

          by JoeMerchant (3937) on Wednesday September 21 2022, @06:29PM (#1272834)

          Just like film: excessively long exposures also lead to saturation and clipping. This is a good read, if you're truly interested:

          https://clarkvision.com/articles/exposure-f-ratio-aperture-and-light-collection/ [clarkvision.com]

          They seem to be in the region of 30-120 seconds per exposure, with clipping already setting in for the subjects in their examples (note the moon at 1/800th of a second). Of course, if you want to go deep-field rather than into the denser parts of the Milky Way, longer exposures would make sense.

          --
          🌻🌻 [google.com]
          • (Score: 2) by Immerman on Wednesday September 21 2022, @07:35PM (1 child)

            by Immerman (3985) on Wednesday September 21 2022, @07:35PM (#1272849)

            Keep in mind that, unlike for the "pretty night sky photos" you link to, you're almost guaranteed to *want* a lot of saturation and clipping in an astronomical photo. Or at least be unable to avoid it because you can't frame your shot to avoid all of the stars that are much brighter than your target. (aka practically everything visible in a pretty night sky photo). Hence the bright blooms on Webb images where it gets so ridiculously oversaturated that a single pinprick of light becomes a huge "lens flare".

            But sure, a few minutes sounds like a plausible exposure limit for earthbound astronomy - your telescope is after all spinning at one revolution per day, and it's all but impossible to compensate for that without introducing so much vibration that you can't get a clear image anyway.

            • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @08:08PM

              by JoeMerchant (3937) on Wednesday September 21 2022, @08:08PM (#1272860)

              That's what the image stackers are really good at, if you can keep each individual image reasonably sharp then they can rotate and align and even distort the subsequent images to align with the first (or whichever) one and then average them together to bring up the SNR, pretty dramatically if your individual exposures are 60 seconds and you've got 360 of them. As I said elsewhere, some people go even further and shoot the same part of the sky night after night... not so good for planetary images, but 10 nights in a row knocks down your noise by another factor of 10 (or, at least lets you get a decent number of shots in-between the clouds...)

              --
              🌻🌻 [google.com]
      • (Score: 0) by Anonymous Coward on Wednesday September 21 2022, @03:22PM

        by Anonymous Coward on Wednesday September 21 2022, @03:22PM (#1272786)

        There are all sorts of clever approaches one could take, including what you mention, but none of them are trivial. Recognizing these streaks falls into the bucket of things that are easy for humans but hard for machines. The way you'd have to approach it is very situationally dependent. If you have long exposure frames with long tracks, that is different than if you have lots of very short integration times with very short tracks that, once they are racked and stacked, make a long track. The more recent work that I've heard of (and I don't actively follow this area), like a lot of things these days, waves the magic machine-learning wand and says they do or will find and correct them this way.

        For your second paragraph, I wouldn't take that bet. Remember that this is astronomy we're talking about here, so don't assume anyone has sufficient budget to do much more than is necessary. Predicting where and when something will be in a future observation is also not easy. The positions of satellites are not known with great precision. You can download their ephemerides and figure out where they'll be and when, but the error bars on that are not small, and you can't extrapolate too far into the future because of all the external forces on the satellites, especially for the lower-orbit ones, due to things like atmospheric drag and gravitational nonuniformities.

  • (Score: 4, Informative) by Immerman on Wednesday September 21 2022, @02:24PM (5 children)

    by Immerman (3985) on Wednesday September 21 2022, @02:24PM (#1272768)

    You seem to be under the impression that long exposures are only necessary for chemical film?

    That's hardly true - you need long exposures for digital sensors as well. Digital sensors suffer from noise - like light static overlaid on the image. It's obvious in photos from cheap cameras, and even with a good camera it will be obvious in a photo taken in a pitch-black room. And the higher the pixel density on the sensor, the worse the noise.

    If you're imaging a dim object (and that's pretty much everything that hasn't been studied to death already) you need to collect light for an extended period for the tiny amount of light hitting a pixel on the sensor to accumulate to the point that it substantially exceeds the noise floor. Video won't cut it - by reading the sensor you reset the accumulated data, and no amount of multi-frame processing will make up for throwing away your data before it accumulates enough to stand clear of the noise.
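The trade-off described here, where each readout injects read noise so many short frames can lose badly to one long exposure on a very dim source, shows up in a toy Monte-Carlo simulation. All the sensor numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
signal_rate = 0.05    # photoelectrons per second from a very dim source (made up)
read_noise = 5.0      # electrons of Gaussian noise added by each readout (made up)
total_time = 600.0    # seconds of total integration

def snr(n_frames, trials=5000):
    """Monte-Carlo SNR of the summed stack when total_time is split into n_frames reads."""
    t = total_time / n_frames
    # Each frame: Poisson photon count plus one dose of read noise; frames summed.
    frames = (rng.poisson(signal_rate * t, size=(trials, n_frames))
              + rng.normal(0.0, read_noise, size=(trials, n_frames)))
    totals = frames.sum(axis=1)
    return totals.mean() / totals.std()

snr_one_long = snr(1)      # a single 600 s exposure
snr_many_short = snr(600)  # six hundred 1 s exposures, stacked
```

With these numbers the single long exposure wins by a wide margin: splitting the integration into 600 reads pays the read-noise penalty 600 times, and no amount of stacking recovers that.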

    • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @03:56PM

      by JoeMerchant (3937) on Wednesday September 21 2022, @03:56PM (#1272795)

      The question is: what is that optimal exposure time?

      Long exposures for CMOS sensors might be 10 seconds, and if you get a thin streak you don't necessarily have to throw out the whole frame.

      In practice, high-investment images these days are composed of thousands of exposures, not 1/60th of a second each to be sure, but thousands nonetheless.

      These images full of streaks are intentionally emphasizing the issue without applying any mitigation.

      --
      🌻🌻 [google.com]
    • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @05:08PM (3 children)

      by JoeMerchant (3937) on Wednesday September 21 2022, @05:08PM (#1272813)

      >no amount of multi-frame processing will make up for throwing away your data before it accumulates enough to stand clear of the noise.

      Actually, you can take 10,000 frames which all appear to be nothing but random noise, average them, and improve the signal to noise ratio by: can you guess? 10,000:1.

      If you're looking for a (repetitive) cardiac signal on the body, you can do some virtually magical signal processing by ensemble averaging - triggered on the R wave from the EKG. Same thing with stars in the sky: they're constant, even if their signal is only 1% of your noise floor, by capturing 10,000 noisy images in which their +0.01 strength signal is superimposed over that +/- 1.0 strength range noise, the average of those 10,000 images will bring your star out 100x brighter than the noise, and the darker sky between the stars will come up dark in the average of the noise (or, if your sensor is biased you'll get a near uniform background value which you can then subtract out...)

      Do satellite flares diminish image quality? Yes, ever-so-slightly, even when removed by clever editing techniques. Can you even notice the lower image quality in a properly processed final image? Nope. The headline photos showing satellite streaks are emphasizing the problem, not showing its true impact.

      --
      🌻🌻 [google.com]
      • (Score: 1, Informative) by Anonymous Coward on Wednesday September 21 2022, @05:48PM (2 children)

        by Anonymous Coward on Wednesday September 21 2022, @05:48PM (#1272823)

        Beating down the noise for independent and identically distributed (IID) samples goes as the square root of the number of samples, so your 10,000 frames would (in principle) knock down your noise by a factor of 100, not 10,000. However, frame averaging only gets you so far when you consider the other sensor noise sources, such as the read noise, fixed pattern noise, etc. You quickly break the IID assumption when you start changing your background signal because now you're getting variable shot noise thrown in. There are ways to approach all of these things, and you end up with an associated residual error in all of these corrections, which ends up getting added in quadrature. Assuming you can do a decent job of co-aligning all of your frames, when you rack and stack them it will certainly look much better for it, but you'll find the useful bottom end on your signal is going to be much higher than you might think it should be.
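The square-root scaling is easy to check numerically with synthetic frames (arbitrary units): averaging 100x as many frames buys only a 10x noise reduction.

```python
import numpy as np

sigma = 1.0     # per-frame noise level (arbitrary units)
truth = 0.01    # faint constant signal, 100x below the per-frame noise

def residual_noise(n, pixels=2000, seed=2):
    """Noise left in an n-frame average, measured across many pixels."""
    rng = np.random.default_rng(seed)
    frames = truth + rng.normal(0.0, sigma, size=(n, pixels))
    return frames.mean(axis=0).std()

r100 = residual_noise(100)       # ~ sigma / sqrt(100)    = 0.1
r10k = residual_noise(10_000)    # ~ sigma / sqrt(10_000) = 0.01, not 1/10,000
```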

        Everything mentioned so far is great for single-band imagery, but if you are using a filter set, now you have the issue that your streak is going to be in a different location for each filter you use, further complicating attempts to extract multi-spectral information, where you assume your combined color image is representative of the scene.

        • (Score: 2) by JoeMerchant on Wednesday September 21 2022, @08:17PM (1 child)

          by JoeMerchant (3937) on Wednesday September 21 2022, @08:17PM (#1272862)

          All good points, and my reference for ensemble averaging is actually in the area of time series data containing cardiac ventricular volumes buried underneath respiratory motions. We would average about 60 heartbeats worth of signal and the cardiac signal would indeed grow to 60x its previous size relative to the (more or less) uncorrelated respiratory signal. There are actually correlations between when your heart beats and your respiratory cycle, but the correlations weren't strong enough to show up in an average of 60 heartbeats.

          We tried all kinds of crazy signal processing ideas over the years, most were pretty meh when applied to real world data, but ensemble averaging exceeded expectations every time we found an application for it, really dramatically good results - with the inherent limitations of the method: your signal has to be repeating, noise at least mostly uncorrelated, what you are getting isn't an instantaneous read but a read of the averaged samples, etc.

          --
          🌻🌻 [google.com]
          • (Score: 0) by Anonymous Coward on Wednesday September 21 2022, @10:26PM

            by Anonymous Coward on Wednesday September 21 2022, @10:26PM (#1272903)

            That sounds like something akin to a lock-in amplifier technique, where you're using the heartbeat as your oscillator and contributions from anything not correlated with your oscillator attenuate out. It is a wonderful technique, but it doesn't seem to be very well known even amongst many of my colleagues who work in labs. There are some techniques like this where I understand the math behind it, and have even worked through derivations, but seeing it off the paper and working in a lab still feels a bit like magic. Especially the going back and forth between the time and frequency domains (or pixel and spatial-frequency domains).
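A bare-bones numerical sketch of that lock-in idea (sample rate, reference frequency, and amplitudes all made up): mix the noisy input with in-phase and quadrature copies of the reference oscillator, then average, which acts as a crude low-pass filter so anything uncorrelated with the oscillator washes out.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 10_000.0    # sample rate in Hz (made up)
f_ref = 137.0    # reference oscillator frequency in Hz (made up)
n = 1_000_000    # 100 seconds of data
t = np.arange(n) / fs

amplitude = 0.01  # signal buried 100x below the noise floor
noisy = amplitude * np.sin(2 * np.pi * f_ref * t) + rng.normal(0.0, 1.0, size=n)

# Mix with in-phase and quadrature references, then average. Noise that is
# uncorrelated with the oscillator averages toward zero; the factor of 2
# compensates for the mean of sin^2 being 1/2.
i_phase = 2.0 * np.mean(noisy * np.sin(2 * np.pi * f_ref * t))
quadrature = 2.0 * np.mean(noisy * np.cos(2 * np.pi * f_ref * t))
recovered = np.hypot(i_phase, quadrature)   # close to the buried amplitude
```

Despite the raw trace looking like pure noise, the demodulated magnitude comes out near the 0.01 amplitude that was mixed in.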