How Much Money Are Southwest, American, United, and Others


Razors Edge

Recommended Posts

...dumping into lobbying for quicker changeover to autonomous passenger flights now? 

I'm thinking, if they weren't last week, they are now.  Airline lobbyists are probably wining and dining the heck out of Congresspeople right now.

If Shatner doesn't die today, you can bet the argument of "if we can put 90yr old actors in space, we can fly a plane from NY to LA" is being bandied about.

2 minutes ago, Razors Edge said:

...dumping into lobbying for quicker changeover to autonomous passenger flights now? 

I'm thinking, if they weren't last week, they are now.  Airline lobbyists are probably wining and dining the heck out of Congresspeople right now.

If Shatner doesn't die today, you can bet the argument of "if we can put 90yr old actors in space, we can fly a plane from NY to LA" is being bandied about.

Anyone can........so long as nothing goes wrong.  The pilot is there for when the shit hits the fan and someone needs to know something besides what's in the program.

3 minutes ago, maddmaxx said:

Anyone can........so long as nothing goes wrong.  The pilot is there for when the shit hits the fan and someone needs to know something besides what's in the program.

Yeah, but that's the weak point if you are an airline (or the trucking industry or the train system, etc.).  We're literally seeing it happen across transportation, and, like automation elsewhere, a business is going to look to "fix" the squeaky bits ASAP.  Have a few too many strikes or wage negotiations at a plant? Add robots. Have a hard time hiring burger flippers? Add robots.  It's a time-honored process, and I think the space stuff either pushes it faster in that direction or, with a catastrophe of some kind, sets it back as they work more behind the scenes to change things in their favor.

2 minutes ago, Razors Edge said:

Yeah, but that's the weak point if you are an airline (or trucking industry or the train system etc.).  We're literally seeing it happen across transportation, and, like automation elsewhere, a business is going to look to "fix" the squeaky bits ASAP.  Have a few too many strikes or wage negotiations at a plant? Add robots. Have a hard time hiring burger flippers? Add robots.  It's a time honored process, and I think the space stuff either pushes it faster in that direction, or with a catastrophe of some kind, sets it back as they work more behind the scenes to change things in their favor.

Do you purchase insurance?  How do you balance out the upside and downside of a robot arm malfunctioning or an airplane crash?

Answering this says a lot about various things.

2 minutes ago, maddmaxx said:

Do you purchase insurance?

For what?  Are you worried about dying in a plane crash?  I can't say that will go down completely to zero once human error is removed, but you probably can assume it will go down, so maybe your need for travel insurance will decline????

12 minutes ago, Razors Edge said:

Yeah, but that's the weak point if you are an airline (or trucking industry or the train system etc.).  We're literally seeing it happen across transportation, and, like automation elsewhere, a business is going to look to "fix" the squeaky bits ASAP.  Have a few too many strikes or wage negotiations at a plant? Add robots. Have a hard time hiring burger flippers? Add robots.  It's a time honored process, and I think the space stuff either pushes it faster in that direction, or with a catastrophe of some kind, sets it back as they work more behind the scenes to change things in their favor.

I see your point, and to me it’s not a technology issue, as we know it can be done.  It’s a matter of trusting the technology (will they make any money?) and the trial in the court of public opinion should a pilotless plane go down killing 200 passengers. 

1 minute ago, ChrisL said:

I see your point and to me it’s not a technology issue as we know it can be done.  It’s a matter of trusting the technology (will they make any money) and trial in the court of public opinion should a pilotless plane go down killing 200 passengers. 

For sure.  Especially at first - like with Tesla and its autopilot.  A crash - especially a fatal one - makes headlines versus the 30,000 deaths and millions of automobile crashes per year (in the US alone).  Public opinion counts A LOT.  But likely it becomes an incremental approach where the actual "switch" date is impossible to determine.  One day it seems like we are all driving ourselves around, and then suddenly, we're not. :D

It may happen in the cargo plane world and military world earlier, as they have more control and less to lose (in a human cargo vs. material cargo sense), but if that goes on with little drama, the switch might be faster than we imagine.

2 hours ago, Razors Edge said:

I can't say that will go down completely to zero once human error is removed, but you probably can assume it will go down

This seems to make the assumption that a pilot will make errors more frequently than the engineers who program software that pilots the autonomous plane.

Piloting a plane and writing software are human activities.  All human activities involve some unavoidable level of error.  Replacing human pilots with autonomous planes piloted by human-written software could very well just exchange one set of errors for another.

Another way to look at it is a human pilot has skin (his own!) in the game, while the engineer will be sitting in an office when his software flies the plane.

1 hour ago, Prophet Zacharia said:

You’d want to remove that small safety feature on your flights?

Well, you could train a flight attendant to step in as required, I guess?

40 minutes ago, Thaddeus Kosciuszko said:

This seems to make the assumption that a pilot will make errors more frequently than the engineers who program software that pilots the autonomous plane.

Piloting a plane and writing software are human activities.  All human activities involve some unavoidable level of error.  Replacing human pilots with autonomous planes piloted by human-written software could very well just exchange one set of errors for another.

Another way to look at it is a human pilot has skin (his own!) in the game, while the engineer will be sitting in an office when his software flies the plane.

And yet...we will have autonomous cars, trucks, planes, ships, trains, and space ships.  Sort of eerie, I guess? 

11 minutes ago, Wilbur said:

Baristas make fine engineers too. No education required. 

So maybe that's just another "emergency aisle" passenger responsibility?  "In the event of an autonomous flight malfunction, are you willing and able to press the big red reset button?"  I'd say yes for the extra legroom for sure!

1 hour ago, Thaddeus Kosciuszko said:

This seems to make the assumption that a pilot will make errors more frequently than the engineers who program software that pilots the autonomous plane.

Piloting a plane and writing software are human activities.  All human activities involve some unavoidable level of error.  Replacing human pilots with autonomous planes piloted by human-written software could very well just exchange one set of errors for another.

Another way to look at it is a human pilot has skin (his own!) in the game, while the engineer will be sitting in an office when his software flies the plane.

Not only that, but having redundant, even if less-than-perfect, overlapping systems actually improves the odds.  That's what we have today.  An aircraft is basically capable of flying itself now and is used in that mode often.  The pilot is there to back up the automatic system.
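
Just to put rough numbers on why the overlap helps, here's a back-of-the-envelope sketch in Python; the failure rates are made up purely for illustration, and real failures are never fully independent:

# Two imperfect, overlapping layers of protection.
# Rates below are invented for illustration only.
p_automation_misses = 1e-4   # automation mishandles a given event
p_pilot_misses      = 1e-3   # the human backup also mishandles it

# An accident needs both layers to fail on the same event
# (treating the failures as independent, which is optimistic).
p_both_miss = p_automation_misses * p_pilot_misses
print(p_both_miss)  # 1e-07 -- better odds than either layer alone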

1 minute ago, maddmaxx said:

Not only that but having redundant, even if less than perfect overlapping systems actually improves the odds.  That's what we have today.  An aircraft is basically capable of flying itself now and is used in that mode often.  The pilot is there to back up the automatic system.

Yep - redundancy is good. And improving the odds is good.  Assuming, at some point, the non-human redundancies become significantly better than the driver/pilot/captain taking up space and drawing a salary, the transition will be made and the public better for it...

...until Skynet (or the Matrix), but that's a different issue.

4 hours ago, Razors Edge said:

Yeah, but that's the weak point if you are an airline (or trucking industry or the train system etc.).  We're literally seeing it happen across transportation, and, like automation elsewhere, a business is going to look to "fix" the squeaky bits ASAP.  Have a few too many strikes or wage negotiations at a plant? Add robots. Have a hard time hiring burger flippers? Add robots.  It's a time honored process, and I think the space stuff either pushes it faster in that direction, or with a catastrophe of some kind, sets it back as they work more behind the scenes to change things in their favor.

Great points.  Either we end up with a fully-paid 24-32 hour work week, or unemployment will be permanently 25% or higher in a generation, with a small number of people making lots of money and a large number of people working for peanuts.

I tell my 13-year-old nephew Adam to make sure he works hard in math and stays in the gifted-talented math classes: math is going to be the best way to qualify for good jobs in the future.  On the national and state tests, he scored in the top 0.1% in math, the top 1% in science, and the top 2% in English. 

When he was 8 and I picked him up from the practice-all-Winter elite baseball team he was on, he asked me, "How do stocks work?"  I hope he keeps it up and doesn't become brain-dead from all the hours spent on video games.

7 hours ago, Thaddeus Kosciuszko said:

This seems to make the assumption that a pilot will make errors more frequently than the engineers who program software that pilots the autonomous plane.

Piloting a plane and writing software are human activities.  All human activities involve some unavoidable level of error.  Replacing human pilots with autonomous planes piloted by human-written software could very well just exchange one set of errors for another.

Another way to look at it is a human pilot has skin (his own!) in the game, while the engineer will be sitting in an office when his software flies the plane.

An exercise in a Programmable Logic Controller class I took was programming an intersection traffic light: turning lanes, a pedestrian crosswalk, and two traffic lanes. 

They had a simulator that ran your program, and showed the results on the big screen at the front of the room.

It was downright comical. Out of a class of about 20, I don't remember anyone's program running perfectly.
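
For anyone curious, here's roughly what that exercise boils down to, sketched in Python instead of ladder logic (the phases and timings are invented, and a real controller needs far more interlocks than this):

import time

# Minimal intersection controller: two traffic phases plus a pedestrian
# walk phase served on request. Durations are made up for illustration.
PHASES = [
    ("main_green",   8),   # main road green, cross street red
    ("main_yellow",  2),
    ("cross_green",  6),   # cross street green, main road red
    ("cross_yellow", 2),
]

def run_cycle(walk_requested):
    for name, duration in PHASES:
        print(f"phase: {name} for {duration}s")
        time.sleep(duration)
        # Serve pedestrians only after a yellow, never by cutting a green short.
        if walk_requested and name.endswith("yellow"):
            print("phase: all_red + walk for 5s")
            time.sleep(5)
            walk_requested = False   # request has been served

if __name__ == "__main__":
    run_cycle(walk_requested=True)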

So, let's say you get to choose.

Choice 1: Board a plane on which the entire trip is flown under the control of a human pilot with years of flying experience and extensive training in responding to out of normal events,

or

Choice 2: Board a plane on which the entire trip is flown under the control of a flight system designed by an engineer with a PhD in avionics control systems where the system has undergone extensive computer simulation testing in response to out of normal events.

 

Some might say that's a false choice because neither accurately represents commercial passenger flight today, but I would suggest the choice strips away much of the distracting clutter to focus on which area of human error is more comfortable to you.

10 hours ago, Thaddeus Kosciuszko said:

So, let's say you get to choose.

Choice 1: Board a plane on which the entire trip is flown under the control of a human pilot with years of flying experience and extensive training in responding to out of normal events,

or

Choice 2: Board a plane on which the entire trip is flown under the control of a flight system designed by an engineer with a PhD in avionics control systems where the system has undergone extensive computer simulation testing in response to out of normal events.

 

Some might say that's a false choice because neither accurately represents commercial passenger flight today, but I would suggest the choice strips away much of the distracting clutter to focus on which area of human error is more comfortable to you.

That's NOW.  But that isn't TOMORROW.  Just like if you asked the Moon landing astronauts if they wanted to be at the "controls" or not versus William Shatner.  This stuff actually evolves and changes as we get better at it.  Perfect? Heck no, but we weren't perfect with humans in full or partial control, so why expect perfection ever?

11 hours ago, Further said:

An exercise in a Programable Logic Controller class I took was programing an intersection traffic light. Turning lanes, pedestrian cross walk and two traffic lanes. 

They had a simulator that ran your program, and showed the results on the big screen at the front of the room.

It was down right comical. Out of a class of about 20 I don't remember anyone's program running perfectly.

So you think folks gave up on PLCs at that point? The DOT threw their hands in the air and said "Let's just keep Barney Fife at the intersection and let him direct traffic"????  I wonder how many hours each person/team in that class put into that programming task versus even the simplest iPhone app?  Think they could have improved their deliverable with a bit of testing and rework? Or did you folks max out the potential and it just couldn't be any better?

11 hours ago, Further said:

An exercise in a Programable Logic Controller class I took was programing an intersection traffic light. Turning lanes, pedestrian cross walk and two traffic lanes. 

They had a simulator that ran your program, and showed the results on the big screen at the front of the room.

It was down right comical. Out of a class of about 20 I don't remember anyone's program running perfectly.

My last 2 years at P$W Aircraft I wrote algorithms for the PLC systems controlling the jet engine test cells.  We were using the Square D "Symax" system, and their company told us it was, to the best of their knowledge, the largest PLC program in the country, even larger than those used on Detroit's auto assembly lines.  I had recently earned my degree as an engineer after years of working as a technician repairing similar systems.  I had 5 other engineers working on the team in addition to myself.  After a while they took to writing the code for the algorithms that I designed because they couldn't get the system to work properly.  The biggest failing.........they usually failed to examine the "what if the action doesn't take place as designed" portions of the program.  Simple things, like assuming a button was off if it wasn't pushed on, clouded their minds.

I loved that stuff.  They were the best two years of my life.  I'm good at "what if it doesn't work the way you thought it would" stuff.

Expect random failures and you will never be disappointed.  :nodhead:

Edit:  When I was at Square D programming school as a student they offered me a job as an instructor.........
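
For what it's worth, here's a tiny sketch of that "did the action actually happen" check in Python rather than real PLC code (the I/O helpers and timing are hypothetical):

import time

# Command an output, then verify the feedback input actually changes state
# within a deadline instead of assuming the command worked.
def command_with_feedback(set_output, read_feedback, timeout_s=2.0):
    set_output(True)                       # energize the output
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_feedback():                # field device confirmed the action
            return True
        time.sleep(0.05)
    set_output(False)                      # fail safe: drop the output
    raise RuntimeError("action not confirmed within timeout")

# Example with dummy I/O standing in for hypothetical valve hardware:
if __name__ == "__main__":
    state = {"valve": False}
    command_with_feedback(lambda v: state.update(valve=v),
                          lambda: state["valve"])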

13 minutes ago, maddmaxx said:

My last 2 years at P$W Aircraft I wrote algorithms for the PLC systems controlling the jet engine test cells.  We were using the Square D "Symax" system and their company told us that it was to the best of their knowledge the largest PLC program in the country, even larger than those used on Detroit's auto assembly lines.  I had recently achieved my degree as an engineer after years of working as a technician repairing similar systems.  I had 5 other engineers working on the team in addition to myself.  After a while they took to writing the code for the algorithms that I designed because they couldn't get the system to work properly.  The biggest failing.........they usually failed to examine the "what if the action doesn't take place as designed" portions of the program.  Simple things like assuming that button was off if not pushed on clouded their minds.

I loved that stuff.  They were the best two years of my life.  I'm good at "what if it doesn't work the way you thought it would" stuff.

Expect random failures and you will never be disappointed.  :nodhead:

Edit:  When I was at Square D programming school as a student they offered me a job as an instructor.........

I attended one of those for TI PLCs in Johnson City, TN. I went with one of the vendors for equipment we were buying and he had his panel lit up like a Christmas tree with running lights and all. It was quite impressive. And so was the area and mountains around there. Beautiful!  I enjoyed the rental car I had!  I seem to remember it being an Olds with standard transmission, but that  would be quite odd for a rental car. 

The unexpected failure mode can be a lot of fun.  At Fleet Sonar School in Key West there was a challenge for the graduating class in what was the trickiest system of the day ('67).  After months of being tested by the instructors on system repair, the students were allowed to create a test problem for the instructors.  The challenge, of course, involved a lot of beer and the promise that the instructors had never failed to find and repair the problem.

After hours of testing the lead instructor declared that we were fucking with them and had not installed a problem to find.  As he did this he slammed his hand down on the emergency off button.  The system kept operating normally.  ;)

They protested the call but lost.

I had shorted the momentary open contacts that broke the power chain.  You couldn't turn the system off with the off button.

:loveshower:

11 hours ago, Thaddeus Kosciuszko said:

So, let's say you get to choose.

Choice 1: Board a plane on which the entire trip is flown under the control of a human pilot with years of flying experience and extensive training in responding to out of normal events,

or

Choice 2: Board a plane on which the entire trip is flown under the control of a flight system designed by an engineer with a PhD in avionics control systems where the system has undergone extensive computer simulation testing in response to out of normal events.

 

Some might say that's a false choice because neither accurately represents commercial passenger flight today, but I would suggest the choice strips away much of the distracting clutter to focus on which area of human error is more comfortable to you.

 

Or Choice 3: Board a plane on which the entire trip is flown under the control of a 'pilot' sitting in a bunker with monitors and a joystick, operating a flight system designed by an engineer with a PhD in avionics control systems, where the system has undergone extensive computer simulation testing in response to out of normal events.  This 'pilot' is the same guy that operated drones over the Middle East looking for chimneys to shoot his ordnance down.  Flight Simulator 2025.

 

4 hours ago, Razors Edge said:

That's NOW.  But that isn't TOMORROW. 

But it IS tomorrow.  Because if tomorrow means transportation by autonomous vehicles operated/piloted by software, any piece of software has a human origin.  Even if software is 'written' by other software, at some point a human had a hand in the chain.

And since humans are not error-free, neither are the complex programs they create to operate autonomous vehicles.  Agreed that the programs will improve as programmers gain more experience, but they will never be error-free.  The human element never goes away; it simply moves from the driver's seat to the programmer's office chair.

As Max said:

4 hours ago, maddmaxx said:

The biggest failing.........they usually failed to examine the "what if the action doesn't take place as designed" portions of the program. 

 

A software program may appear to be intelligent, and it will retain that appearance as long as the inputs and outputs match what the programmer had intended and the programming sequences follow as the programmer envisioned.  Max's example of the emergency button is an excellent one - the software (written by a human) could not detect when a device supplying information to the program had turned faulty.  I.e., the action didn't take place as designed.

To the point - a human would have recognized after mashing the E-stop button that something was wrong when the machine didn't stop.  The human then takes a secondary action, such as opening the power disconnect, to shut down the machine.  The software keeps humming right along, ignorant of the input and the conditions that demanded the machine stop.  Because the human who wrote the program never 'told' it to look for a problem in the E-stop, never mind what to do if it found one.
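
A rough illustration of the kind of check that would have caught the shorted E-stop contact, again as a Python sketch rather than actual PLC code (the dual-channel wiring and names are assumptions):

# Monitor two redundant E-stop channels instead of trusting one contact.
# A shorted (bypassed) contact then shows up as a channel-disagreement fault
# rather than silently doing nothing when the button is mashed.
DISAGREE_LIMIT = 10   # scan cycles the channels may disagree before faulting

def check_estop(channel_a_closed, channel_b_closed, disagree_count):
    if channel_a_closed != channel_b_closed:
        disagree_count += 1
        if disagree_count >= DISAGREE_LIMIT:
            raise RuntimeError("E-stop channel fault: possible shorted or stuck contact")
    else:
        disagree_count = 0
    stop_requested = not (channel_a_closed and channel_b_closed)
    return stop_requested, disagree_count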

11 minutes ago, Thaddeus Kosciuszko said:

To the point

...quite simply: Autonomous planes, trains, automobiles, and ships ARE on the horizon.  The pros/cons and the acceptable risks we end up with are gonna be what they are.

I'm not sure if you are arguing autonomous vehicles are not coming or just that there will be some challenges and risks that are part of that changeover.  The latter is obvious, and the former is just wrong.
