
posted by Fnord666 on Monday January 08 2018, @01:39AM   Printer-friendly
from the electric-everywhere dept.

Arthur T Knackerbracket has found the following story:

Australia's first electric aircraft has begun test flights at Perth's Jandakot Airport, amid hopes the plane will be flying to nearby Rottnest Island within months.

The two-seater single-engine Pipistrel Alpha Electro has two batteries that can keep the plane in the air for an hour, with an extra 30 minutes in reserve.

The team behind the plane says while there are environmental benefits in doing away with jet fuel, electric planes are also safer and easier to fly.

"Electric propulsion is a lot simpler than a petrol engine," Electro.Aero founder Joshua Portlock said. "Inside a petrol engine you have hundreds of moving parts. In this aircraft you have one switch to turn the aircraft on and one throttle lever to fly."

The engine is powered by two lithium-ion batteries, similar to those used in the Tesla electric car. There is no gearbox and no multitude of moving engine parts; instead, the plane's motor attaches directly to the propeller. Rather than a fuel gauge, a panel tells the pilot the amount of power left in the battery and the estimated minutes of flight time, based on the throttle position.
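The panel's flight-time estimate amounts to simple arithmetic: divide the energy left in the pack by the power being drawn at the current throttle setting. The figures below (pack capacity, cruise power) are illustrative assumptions, not Pipistrel specifications.

```python
# Illustrative sketch of a battery flight-time estimate.
# PACK_CAPACITY_WH and the power figures are assumed values,
# not the Alpha Electro's actual specifications.

PACK_CAPACITY_WH = 21_000  # assumed usable pack energy, watt-hours

def estimated_minutes(remaining_fraction: float, throttle_kw: float) -> float:
    """Minutes of flight left at the current throttle power draw."""
    remaining_wh = PACK_CAPACITY_WH * remaining_fraction
    return remaining_wh / (throttle_kw * 1000) * 60

# Half a pack at an assumed 14 kW cruise setting:
print(round(estimated_minutes(0.5, 14.0)))  # 45 minutes
```

The estimate moves with the throttle, which is why the panel recomputes it from throttle position rather than showing a fixed number.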

The batteries are re-energised in about an hour by a supercharger based at the Jandakot airfield.

[...] In mid-January Mr Bodley will begin training local pilots to fly the single-engine electric plane, with registered pilots required to complete a familiarisation flight before flying solo.

Mr Portlock said the group had held discussions with the Rottnest Island Authority to install a supercharger to tap into its solar array, allowing pilots to fly the plane to the island.

Future plans include electric air-taxis capable of carrying up to five people to the holiday destination.


Original Submission

  • (Score: 2) by dry on Tuesday January 09 2018, @03:24AM (3 children)

    by dry (223) on Tuesday January 09 2018, @03:24AM (#619854) Journal

    I wonder how much it was the pilot, and how much the computer or dumb luck, that they landed on a busy waterway without hitting any boats.
    Hmm, reading the wiki, it seems it was all the pilot's decision, with the fly-by-wire system first helping the glide and then stopping the pilot from achieving the maximum landing flare that would have softened the impact.
    Reminds me of the Gimli Glider, where the pilot straddled a guard rail to shed speed and avoid running over the kids in front of the aircraft. Whether a computer would have made that decision is questionable, along with whether a computer would have been aware of the abandoned airfield in the middle of nowhere and been capable of gliding a 767, which was considered unglidable. It was also an example of bad data (mixing up imperial and metric units), along with a defective fuel gauge, leading to big problems. A better outcome than the Mars Climate Orbiter, which was on full autopilot.
    https://en.wikipedia.org/wiki/Gimli_Glider [wikipedia.org]
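    For what it's worth, that imperial/metric mix-up can be sketched numerically. The figures below are approximate, drawn from public accounts of the 1983 incident: the crew converted litres to "kilograms" using the pounds-per-litre factor, so the aircraft departed with less than half the fuel the flight plan required.

```python
# Approximate sketch of the Gimli Glider fuel arithmetic (figures from
# public accounts of the 1983 incident; illustrative, not authoritative).

KG_PER_L = 0.803   # correct density of the fuel, kilograms per litre
LB_PER_L = 1.77    # factor actually used, pounds per litre

required_kg = 22_300   # fuel the flight plan called for
onboard_l = 7_682      # litres measured in the tanks by drip stick

# The mistake: litres * (lb/L) yields pounds, but it was read as kilograms.
onboard_wrong_kg = onboard_l * LB_PER_L
litres_to_load = (required_kg - onboard_wrong_kg) / LB_PER_L

# Real fuel on board after loading, in actual kilograms:
actual_kg = (onboard_l + litres_to_load) * KG_PER_L
print(round(actual_kg))  # roughly 10,100 kg, less than half of 22,300
```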

  • (Score: 2) by JoeMerchant on Tuesday January 09 2018, @01:08PM (2 children)

    by JoeMerchant (3937) on Tuesday January 09 2018, @01:08PM (#619983)

    Certainly, human pilots today tend to have more experience and more data input sources (primarily the visual scanning field) than autopilots and autopilot software authors do....

    However, we are quickly approaching the cusp at which autopilots will have more data input sources than they can process, and more than a human will ever be capable of handling. (Good) autopilot software should be accumulating knowledge in a relatively immortal fashion, and eventually be trained beyond the capacity of 30 years of human experience.

    There will always remain situations where a human can find a creative, novel, potentially superior solution to a unique problem that an autopilot would miss, but the point at which that advantage no longer outweighs the autopilots' massive database, lightning reflexes, and aggregation of more real-time data sources would seem to be approaching.

    Meanwhile, bad stuff will be happening during autopilot development. I find it interesting that this Google search:

    https://www.google.com/search?q=gainesville+autopilot+crash+terminal [google.com]

    does not turn up the story of researchers who had outfitted a light aircraft with experimental, more-autonomous autopilot hardware and software and accidentally crashed it into the Gainesville airport terminal, with some pretty serious consequences. Details are fuzzy in my head, but it happened within roughly the last 10 years and was considered an extreme setback for that particular autopilot development program.

    --
    🌻🌻 [google.com]
    • (Score: 2) by dry on Tuesday January 09 2018, @05:30PM (1 child)

      by dry (223) on Tuesday January 09 2018, @05:30PM (#620105) Journal

      You're probably right about autopilots getting better and better and improving things overall. Perhaps it is human nature, or just my age, to be distrustful of machines driving or flying autonomously, especially the idea of totally eliminating the possibility of a human piloting a vehicle.
      As for your search, it is interesting how Google serves results to different people. Here, the second result, sandwiched between two results about a fatal Tesla autopilot crash, was http://www.gainesville.com/article/LK/20080407/news/604159153/GS/ [gainesville.com] which is probably the accident you're thinking of. Seems there was too much damage to say exactly what caused the crash, but the article does discuss the autopilot.

      • (Score: 2) by JoeMerchant on Tuesday January 09 2018, @06:32PM

        by JoeMerchant (3937) on Tuesday January 09 2018, @06:32PM (#620136)

        Yep, that's the one - my first page of results was over 50% Tesla and didn't have this, which is doubly odd because I've used Google to search for and read this exact article in the past...

        The thing I trust autopilots with the least is negotiating with human pilots, especially in crowded conditions like freeway traffic.

        As for autopilot development: we lost a drone during development (apparently it went down deep in the woods, the best possible result short of a recovery without damage). Post-loss analysis showed us about half a dozen decisions that led to the loss; any one of them, taken more conservatively, would have saved the day (flying without the RDF tracker attached was my favorite mistake to remind people of). But this was about flight #25 in the program, and if we had scrubbed for every "maybe" during development, at least half of the prior flights would have been grounded. Risk is one expense of progress.

        --
        🌻🌻 [google.com]