Insights Into Tomorrow: Episode 6 “Artificial Intelligence”

Is Artificial Intelligence the future of human development or the potential downfall of human civilization?

What is the state of Artificial Intelligence today?

What are the pros and cons of artificial intelligence?

We address these and many other questions in this episode of Insights Into Tomorrow. We’ll take a look at the four different levels of Artificial Intelligence and where today’s rudimentary AI has the potential to go.

Then we’ll ask the tough questions about AI. Should we allow AI to develop self-awareness? Should AI be able to make life-and-death decisions? Will AI rob humans of jobs and safety?

There are a lot of questions and a lot of concerns about AI that need to be asked and answered before we get too far down the rabbit hole and reach a point of no return.

Check out our recently published article on this subject available now on Medium.com.

Show Notes

  • Introductions
    • Insights Into Tomorrow Episode 6 “Artificial Intelligence”
    • My co-host Sam Whalen
        
  • What is Artificial Intelligence
    • Artificial intelligence – simulating human intelligence in machines – used to be confined to science fiction. But in recent decades, it’s broken into the real world, becoming one of the most important technologies of our time. In addition to being the brains behind facial recognition, AI is helping to solve critical problems in transportation, retail and health care (spotting breast cancer missed by human eyes, for example). On the internet, it’s used for everything from speech recognition to spam filtering. Movie studios plan to use AI to analyze potential movies and choose which ones to put into development.
       
      [ADVERTISEMENT 1 – The Second Sith Empire] 
  • Four types of Artificial Intelligence
    • REACTIVE MACHINES
      • The most basic types of AI systems are purely reactive, and have the ability neither to form memories nor to use past experiences to inform current decisions.
      • Deep Blue, IBM’s chess-playing supercomputer, which beat international grandmaster Garry Kasparov in the late 1990s, is the perfect example of this type of machine.
        • Deep Blue can identify the pieces on a chess board and know how each moves.
        • It can make predictions about what moves might be next for it and its opponent.
        • And it can choose the most optimal moves from among the possibilities.
        • But it doesn’t have any concept of the past, nor any memory of what has happened before.
      • Google’s AlphaGo, which has beaten top human Go experts, can’t evaluate all potential future moves either.
        • Its analysis method is more sophisticated than Deep Blue’s, using a neural network to evaluate game developments.
        • These methods do improve the ability of AI systems to play specific games better, but they can’t be easily changed or applied to other situations.
        • These computerized imaginations have no concept of the wider world – meaning they can’t function beyond the specific tasks they’re assigned and are easily fooled.
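The reactive pattern described above can be caricatured in a few lines of code. This is a hypothetical sketch, not IBM’s actual engine: a stateless move chooser that scores every legal move from the current position and retains nothing between calls.

```python
# A caricature of a reactive machine: no memory, no learning.
# Each call looks only at the current position, scores every legal
# move, and returns the best one. Nothing persists between calls.

def choose_move(position, legal_moves, evaluate):
    """Pick the move whose resulting position scores highest.

    position    -- current game state (any type)
    legal_moves -- function: state -> iterable of (move, next_state)
    evaluate    -- function: state -> numeric score (higher is better)
    """
    best_move, best_score = None, float("-inf")
    for move, next_state in legal_moves(position):
        score = evaluate(next_state)
        if score > best_score:
            best_move, best_score = move, score
    return best_move

# Toy "game": the state is an integer, a move adds -1, +1, or +2,
# and the goal is to land as close to 10 as possible.
def toy_moves(state):
    return [(d, state + d) for d in (-1, 1, 2)]

def toy_eval(state):
    return -abs(10 - state)

print(choose_move(8, toy_moves, toy_eval))  # 2, since 8 + 2 == 10
```

Note that the chooser never remembers a previous position: play the same position twice and it repeats the same analysis from scratch, which is exactly the limitation Deep Blue had.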
            
    • LIMITED MEMORY
      • This Type II class contains machines that can look into the past.
        • Self-driving cars do some of this already.
          • For example, they observe other cars’ speed and direction.
          • That can’t be done in just one moment; rather, it requires identifying specific objects and monitoring them over time.
          • These observations are added to the self-driving cars’ preprogrammed representations of the world, which also include lane markings, traffic lights and other important elements, like curves in the road.
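The limited-memory idea can be sketched as follows. This is an illustrative toy, not a real self-driving stack: a tracked object keeps only a short rolling window of observations, just enough to derive another car’s speed over time.

```python
# Limited-memory sketch: estimating another car's speed requires
# observations from more than one moment. We keep a short rolling
# window of (time, position) samples and derive speed from them.
from collections import deque

class TrackedObject:
    def __init__(self, window=5):
        # Only the last `window` samples are retained: this is
        # "limited memory", not a full history the system learns from.
        self.samples = deque(maxlen=window)

    def observe(self, t, position):
        self.samples.append((t, position))

    def speed(self):
        """Average speed over the retained window, or None if unknown."""
        if len(self.samples) < 2:
            return None
        (t0, p0), (t1, p1) = self.samples[0], self.samples[-1]
        return (p1 - p0) / (t1 - t0)

car = TrackedObject()
for t, pos in [(0.0, 0.0), (0.1, 1.5), (0.2, 3.0)]:
    car.observe(t, pos)
print(car.speed())  # ~15.0 over the observed window
```

The key contrast with a reactive machine is the deque: the car’s speed is only computable because a few past observations are kept, but once samples fall out of the window they are gone for good.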
              
    • THEORY OF MIND
      • Machines in the next, more advanced, class not only form representations about the world, but also about other agents or entities in the world.
      • In psychology, this is called “theory of mind” – the understanding that people, creatures and objects in the world can have thoughts and emotions that affect their own behavior.
      • Without understanding each other’s motives and intentions, and without taking into account what somebody else knows either about me or the environment, working together is at best difficult, at worst impossible.
          
    • SELF-AWARENESS
      • The final step of AI development is to build systems that can form representations about themselves.
      • Consciousness is also called “self-awareness” for a reason. (“I want that item” is a very different statement from “I know I want that item.”)
      • Conscious beings are aware of themselves, know about their internal states, and are able to predict feelings of others.
        • We assume someone honking behind us in traffic is angry or impatient, because that’s how we feel when we honk at others.
        • Without a theory of mind, we could not make those sorts of inferences.
            

[ADVERTISEMENT 2 – Insights Into Entertainment] 

  • Pros and Cons of today’s Artificial Intelligence
    • Pros
      • Reduction in human error
        • AI in weather forecasting has significantly increased the reliability of weather forecasts
      • Reduce risks to humans
        • AI robots could be used to clean up nuclear and chemical waste (Chernobyl/Fukushima)
      • Available 24×7
        • AI doesn’t need to sleep, eat or take breaks. It can process bank documents around the clock, improving overall efficiency
      • Digital Assistance
        • AI powered customer service centers can provide fast, efficient service on a consistent basis at any hour of the day
      • Faster Decisions
        • AI can analyze input from a wide array of sensors and make quick, decisive choices far faster than a human could react, which can mean the difference between life and death in settings such as a manufacturing environment
      • Enhancing Human Functions
        • Advanced AI is capable of detecting breast cancer at earlier stages than humans alone are capable of doing
            
    • Cons
      • High cost of creation
        • Increasingly sophisticated AI requires a constant improvement in hardware and software which can get costly quickly
      • Unemployment
        • The dystopian fear of intelligent machines replacing human jobs is a very real threat
      • Emotionless
        • Current AI technology lacks the ability to form bonds with humans
      • Static Thinking
        • Today’s AI lacks the ability to think outside the box. It can’t conceive of the world outside the confines of its specific program
      • No improvement over time
        • Unlike humans, who learn and adapt with experience, current AI is purpose-built and cannot evolve based on observation and input from its tasks

 [ADVERTISEMENT 3 – Insights Into Teens] 

  • Artificial Intelligence and the Future
    • The Good
      • Fleets of self-driving vehicles replacing unsafe drivers and human error on the roads
      • Intelligent digital assistants taking steps well beyond today’s Siri and Alexa functionality
      • Automated public transportation systems
      • Automated intelligent medical monitoring and proactive prevention of illness
      • Curated textbooks, virtual tutors and facial analysis helping out with education
      • Media using AI to analyze financial reports and news trends
      • Natural language abilities that can translate multiple languages in real time
            
    • The Bad
      • Fewer, more specialized jobs and the reduction of manufacturing jobs
      • Insurance companies determining claims processing based on AI accumulated data
      • General loss of marketable skills as AI takes over the more mundane jobs of humans
      • Humans getting progressively lazier as machines do more and more for us
      • Deepfakes used to control the media and popular opinion
      • Use of AI for identity theft and abuse of personal data
          
    • The Ugly
      • Automated military drones deciding when to use deadly force
      • Intelligence enables control; AI that is more intelligent than humans could institute controls over humans
      • Super intelligent AI could make humans obsolete and therefore a threat to AI
      • AI used for political manipulation to game voting systems and control the voting population

[TRANSITION] 

  • Final Thoughts and Closing Remarks

Transcription

0:00:01.730,0:00:10.130
insightful podcasts by informative hosts

0:00:11.750,0:00:20.500
[Music]

0:00:15.109,0:00:23.670
insights into things a podcast network

0:00:20.500,0:00:23.670
[Music]

0:00:25.890,0:00:31.560
welcome to insights into tomorrow where

0:00:29.490,0:00:34.470
we take a deeper look into how the

0:00:31.560,0:00:38.190
issues of today will impact the world of

0:00:34.470,0:00:41.160
tomorrow from politics and world news to

0:00:38.190,0:00:43.230
media and technology we discuss how

0:00:41.160,0:00:45.020
today’s headlines are becoming

0:00:43.230,0:00:57.430
tomorrow’s reality

0:00:45.020,0:00:59.609
[Music]

0:00:57.430,0:01:02.710
[Applause]

0:00:59.609,0:01:06.189
welcome to insights into tomorrow this

0:01:02.710,0:01:08.770
is episode 6 artificial intelligence I’m

0:01:06.189,0:01:11.470
your host Joseph Whalen and my co-host

0:01:08.770,0:01:13.090
Sam Whalen how you doing today Sam I’m

0:01:11.470,0:01:14.860
doing ok all things considered

0:01:13.090,0:01:16.689
ok it looks like I caught you off guard

0:01:14.860,0:01:17.850
with that comment you never see it

0:01:16.689,0:01:20.700
coming

0:01:17.850,0:01:23.950
so today we’re going to be talking about

0:01:20.700,0:01:27.189
artificial intelligence which seems to

0:01:23.950,0:01:29.110
be invading every aspect of our lives in

0:01:27.189,0:01:30.880
fact we had to unplug our piece of

0:01:29.110,0:01:33.119
artificial technology so that we didn’t

0:01:30.880,0:01:37.030
get bothered by anyone showing up today

0:01:33.119,0:01:39.700
so let’s just start off by talking about

0:01:37.030,0:01:41.799
what is artificial intelligence in your

0:01:39.700,0:01:45.369
own words define what artificial

0:01:41.799,0:01:46.539
intelligence is to you I guess if I had

0:01:45.369,0:01:49.240
to put it into my own words it’s

0:01:46.539,0:01:52.659
probably the humanity’s ability to

0:01:49.240,0:01:55.899
create programs that can think for

0:01:52.659,0:01:57.390
themselves on well maybe I don’t think

0:01:55.899,0:02:00.869
for themselves I don’t mean entirely

0:01:57.390,0:02:03.879
it’s like what’s the word a self-aware

0:02:00.869,0:02:05.709
but programs that can function on their

0:02:03.879,0:02:07.509
own whether that be completing a task

0:02:05.709,0:02:09.340
over and over again or something more

0:02:07.509,0:02:11.500
advanced like memory you know storing

0:02:09.340,0:02:15.220
memory things like ok and I think that’s

0:02:11.500,0:02:18.280
that’s certainly a valid definition I

0:02:15.220,0:02:21.069
think there’s several definitions really

0:02:18.280,0:02:23.410
and we’ll talk about those but the

0:02:21.069,0:02:25.810
clinical definition from the research

0:02:23.410,0:02:28.269
that I did is that artificial

0:02:25.810,0:02:33.010
intelligence is the simulation of human

0:02:28.269,0:02:34.900
intelligence in machines and it used to

0:02:33.010,0:02:37.180
be confined to science fiction but in

0:02:34.900,0:02:39.099
recent decades it’s broken into the real

0:02:37.180,0:02:41.769
world it’s becoming one of the most

0:02:39.099,0:02:44.620
important technologies in our time and

0:02:41.769,0:02:47.440
really it’s in everything so in addition

0:02:44.620,0:02:49.959
to being the brains behind facial

0:02:47.440,0:02:52.030
recognition AI is helping to solve

0:02:49.959,0:02:55.180
critical problems in transportation

0:02:52.030,0:02:59.170
retail healthcare its spotting breast

0:02:55.180,0:03:01.239
cancer missed by human eyes on the

0:02:59.170,0:03:04.110
internet it’s used for everything from

0:03:01.239,0:03:07.420
speech recognition to spam filtering

0:03:04.110,0:03:09.250
movie studios even start to plan plan to

0:03:07.420,0:03:10.540
start using AI to analyze potential

0:03:09.250,0:03:12.189
movies that

0:03:10.540,0:03:16.420
choose which ones to put in the

0:03:12.189,0:03:18.430
development so it’s really depending on

0:03:16.420,0:03:20.859
your aspect of it it’s really a matter

0:03:18.430,0:03:24.670
of replacing human thinking it seems

0:03:20.859,0:03:25.719
like in in many cases so what we’re

0:03:24.670,0:03:28.120
going to do is we want to talk about

0:03:25.719,0:03:30.819
there’s four basic classifications of

0:03:28.120,0:03:32.319
artificial intelligence two of which

0:03:30.819,0:03:36.010
that have already been realized with

0:03:32.319,0:03:38.920
current technology two of which have not

0:03:36.010,0:03:42.430
and potentially could go in different

0:03:38.920,0:03:44.019
directions then we’ll look at the pros

0:03:42.430,0:03:47.590
and cons of today’s artificial

0:03:44.019,0:03:50.349
intelligence and we’ll look at

0:03:47.590,0:03:51.849
intelligence in the future which is you

0:03:50.349,0:03:54.040
know one of the things that we always do

0:03:51.849,0:03:55.540
and we’ll look at the good the bad and

0:03:54.040,0:03:57.040
the ugly of where artificial

0:03:55.540,0:04:00.099
intelligence could go because there are

0:03:57.040,0:04:03.010
some some very well-known critics of

0:04:00.099,0:04:06.400
artificial intelligence who are its

0:04:03.010,0:04:08.290
detractors so let’s get started we’ll

0:04:06.400,0:04:19.000
talk about our four types of artificial

0:04:08.290,0:04:21.039
intelligence first so the first type of

0:04:19.000,0:04:24.430
artificial intelligence that we’re going

0:04:21.039,0:04:26.139
to look at are reactive machines so

0:04:24.430,0:04:30.000
these are the most basic types of AI

0:04:26.139,0:04:32.110
systems and they are purely reactive

0:04:30.000,0:04:34.360
they don’t have the ability to form

0:04:32.110,0:04:38.320
memories they don’t have the ability to

0:04:34.360,0:04:42.909
recognize past experiences and probably

0:04:38.320,0:04:44.889
the biggest example of this is IBM’s

0:04:42.909,0:04:48.789
deep blue of you have you are you

0:04:44.889,0:04:50.110
familiar with deep blue no I believe in

0:04:48.789,0:04:51.639
the notes is that the one the chess

0:04:50.110,0:04:53.770
machine that’s correct

0:04:51.639,0:04:56.260
okay um yeah no no I mean I’ve heard of

0:04:53.770,0:04:57.490
different programs you know going up

0:04:56.260,0:04:59.740
against chess players but not that one

0:04:57.490,0:05:03.370
specifically so so deep blue was a whole

0:04:59.740,0:05:05.680
computer system that IBM had built with

0:05:03.370,0:05:07.659
the sole purpose it was a a

0:05:05.680,0:05:11.169
supercomputer with the sole purpose of

0:05:07.659,0:05:12.430
playing chess better than humans I don’t

0:05:11.169,0:05:15.039
know if you’ve ever have you ever played

0:05:12.430,0:05:16.930
computer chess on the PC or anything no

0:05:15.039,0:05:17.979
I had a chess board that played against

0:05:16.930,0:05:19.960
you like I told you right into the

0:05:17.979,0:05:21.420
pieces right but I’m so bad at just it

0:05:19.960,0:05:23.400
didn’t really need to try that hard

0:05:21.420,0:05:25.470
and then like I guess that’s my point

0:05:23.400,0:05:27.450
I’m in on the same way like I know how

0:05:25.470,0:05:29.730
the pieces move and I know some basic

0:05:27.450,0:05:32.460
strategy but I’m terrible at it

0:05:29.730,0:05:35.730
so it didn’t take much for a basic

0:05:32.460,0:05:39.660
computer to beat me at chess so IBM

0:05:35.730,0:05:41.910
built this entire supercomputer to play

0:05:39.660,0:05:48.290
who was at the time the world champion

0:05:41.910,0:05:52.620
Garry Kasparov back in the late 90s and

0:05:48.290,0:05:55.560
it was simplistic in its ability but it

0:05:52.620,0:05:59.070
was sophisticated in its achievement so

0:05:55.560,0:06:01.700
deep blue could identify the pieces on

0:05:59.070,0:06:03.960
the chessboard and know how each moves

0:06:01.700,0:06:06.000
okay so he’s reached my level of

0:06:03.960,0:06:09.630
intelligence at that point so I can

0:06:06.000,0:06:12.240
identify them and move them deep blue

0:06:09.630,0:06:14.970
can make predictions about what moves

0:06:12.240,0:06:17.970
might be next for it and its opponents

0:06:14.970,0:06:19.920
and this is where really its

0:06:17.970,0:06:22.650
intelligence could go so for instance it

0:06:19.920,0:06:25.710
could analyze the board and it could

0:06:22.650,0:06:27.620
compute every possible permutation of a

0:06:25.710,0:06:30.630
move that could be done at that time

0:06:27.620,0:06:33.360
which is pretty impressive and then it

0:06:30.630,0:06:35.250
could decide what the best move is and

0:06:33.360,0:06:39.150
basically play an entire game out on a

0:06:35.250,0:06:42.030
single move it can choose the most

0:06:39.150,0:06:45.930
optimal moves but it doesn’t have the

0:06:42.030,0:06:48.690
concept of the past like it can’t learn

0:06:45.930,0:06:50.700
from its mistakes it doesn’t have any

0:06:48.690,0:06:55.320
memory of what happened before so it’s

0:06:50.700,0:06:57.150
here now in the moment and and that’s

0:06:55.320,0:07:00.530
really where its restrictions were it

0:06:57.150,0:07:03.710
could analyze the current situation

0:07:00.530,0:07:06.660
whereas another example would be

0:07:03.710,0:07:08.970
google’s alphago where it played another

0:07:06.660,0:07:13.350
strategy game called Go which is a

0:07:08.970,0:07:15.000
Chinese game and go is the Chinese

0:07:13.350,0:07:16.980
equivalent of chess but it’s a

0:07:15.000,0:07:18.570
territorial based game so there’s a lot

0:07:16.980,0:07:19.730
of strategy there’s a lot of forethought

0:07:18.570,0:07:22.440
that goes into it

0:07:19.730,0:07:25.530
there’s sacrifice strategies and so

0:07:22.440,0:07:27.530
forth they’re going to go and it’s

0:07:25.530,0:07:30.930
beating the top go players in the world

0:07:27.530,0:07:33.720
but it can’t evaluate all the potential

0:07:30.930,0:07:36.870
moves that deep blue can so there’s

0:07:33.720,0:07:39.189
different types of technology

0:07:36.870,0:07:41.499
its analysis method was more

0:07:39.189,0:07:43.509
sophisticated because it used neural

0:07:41.499,0:07:45.729
networks which is exactly what humans

0:07:43.509,0:07:47.169
have for brains where we can form

0:07:45.729,0:07:51.729
connections in our brains through a

0:07:47.169,0:07:54.430
neural network but it doesn’t have the

0:07:51.729,0:07:56.289
ability to retain or learn or anything

0:07:54.430,0:07:58.990
like that so that’s sort of a limitation

0:07:56.289,0:08:01.629
that we have with reactive machines and

0:07:58.990,0:08:04.599
and they’re still in use today but this

0:08:01.629,0:08:10.180
was sort of the infancy of artificial

0:08:04.599,0:08:12.069
intelligence are there any applications

0:08:10.180,0:08:13.689
that you could think of from a practical

0:08:12.069,0:08:17.800
standpoint that we could use a machine

0:08:13.689,0:08:20.919
like that today maybe not practical but

0:08:17.800,0:08:22.810
I guess for competitive play if you know

0:08:20.919,0:08:24.999
because I know IBM also made Watson for

0:08:22.810,0:08:26.229
Jeopardy right and that did pretty well

0:08:24.999,0:08:28.569
I don’t remember if it won against Ken

0:08:26.229,0:08:30.430
Jennings and the third player but I know

0:08:28.569,0:08:31.960
it did very well with that kind of

0:08:30.430,0:08:33.820
technology and I imagine it’s probably

0:08:31.960,0:08:35.919
you know something similar where they

0:08:33.820,0:08:37.209
analyzed because jeopardy repeats

0:08:35.919,0:08:38.469
questions so they probably analyzed all

0:08:37.209,0:08:40.870
the questions put it in with memory

0:08:38.469,0:08:43.510
banks or its data banks and then allowed

0:08:40.870,0:08:45.130
it to get strategies from that so I

0:08:43.510,0:08:47.500
could definitely see for competitive

0:08:45.130,0:08:50.410
play like that and also it could

0:08:47.500,0:08:53.110
probably be used to analyze trends in

0:08:50.410,0:08:55.029
these games so like how the best in the

0:08:53.110,0:08:57.190
world the best humans in the world how

0:08:55.029,0:09:00.250
they win and then maybe learn from that

0:08:57.190,0:09:02.380
information but I I might just be

0:09:00.250,0:09:03.850
blanking but I can’t think of any any

0:09:02.380,0:09:07.029
more practical than that and I think

0:09:03.850,0:09:09.279
that’s sort of the challenge that that

0:09:07.029,0:09:12.160
artificial intelligence had in those

0:09:09.279,0:09:14.019
early days was that you couldn’t make

0:09:12.160,0:09:15.519
money off of it yeah like there wasn’t

0:09:14.019,0:09:18.160
something where you could build this

0:09:15.519,0:09:21.699
machine stick it in industry and have it

0:09:18.160,0:09:26.560
produce something so a lot of it was

0:09:21.699,0:09:29.610
confined to college studies the next

0:09:26.560,0:09:33.279
type of machine type two class are

0:09:29.610,0:09:35.649
machines with limited memory so they can

0:09:33.279,0:09:38.889
look into the past and for this we look

0:09:35.649,0:09:41.949
at self-driving cars so self-driving

0:09:38.889,0:09:43.860
cars for example can observe other cars

0:09:41.949,0:09:46.300
speed and direction

0:09:43.860,0:09:49.750
[Music]

0:09:46.300,0:09:51.930
that can’t be done in just one moment so

0:09:49.750,0:09:54.100
they have to look at it in time slices

0:09:51.930,0:09:55.840
they need to be able to identify

0:09:54.100,0:09:59.110
specific objects they need to monitor

0:09:55.840,0:10:02.410
them over time you think of just driving

0:09:59.110,0:10:03.820
down the road you’re looking at objects

0:10:02.410,0:10:05.880
even if you’re just looking at the lines

0:10:03.820,0:10:09.070
in the road and the bends in the road

0:10:05.880,0:10:10.960
you have to look at these over time and

0:10:09.070,0:10:13.810
and there has to be a progression

0:10:10.960,0:10:16.540
sequence that the computer itself can

0:10:13.810,0:10:21.190
understand and then predict what it has

0:10:16.540,0:10:22.960
to do the observations are added to the

0:10:21.190,0:10:25.450
self-driving cars pre-programmed

0:10:22.960,0:10:28.480
representations of the world so you

0:10:25.450,0:10:29.830
figure you have a map overlay then you

0:10:28.480,0:10:32.110
have these observations that are

0:10:29.830,0:10:34.540
overlaid on the map itself so you can do

0:10:32.110,0:10:36.460
obstacle avoidance and things along

0:10:34.540,0:10:39.040
those lines

0:10:36.460,0:10:41.410
these include lane markings traffic

0:10:39.040,0:10:45.010
lights and other important elements like

0:10:41.410,0:10:46.300
curves in the road so now at the type two

0:10:45.010,0:10:49.510
stage now you’re getting something

0:10:46.300,0:10:50.680
that’s a little bit more practical at

0:10:49.510,0:10:53.170
this point you have something you can

0:10:50.680,0:10:55.360
sell Tesla loves this technology for

0:10:53.170,0:10:59.020
their cars their autopilot system that

0:10:55.360,0:11:02.440
they have where do you see other

0:10:59.020,0:11:05.500
applications of these limited memory

0:11:02.440,0:11:07.510
type two AIs functioning I think

0:11:05.500,0:11:08.980
definitely the the self-driving cars are

0:11:07.510,0:11:11.290
the beginning of it but I could see that

0:11:08.980,0:11:15.460
technology being migrated to other

0:11:11.290,0:11:17.230
things maybe for military use maybe or I

0:11:15.460,0:11:19.360
think we have a coming up later but

0:11:17.230,0:11:22.540
shipping or not shipping I like trucking

0:11:19.360,0:11:24.520
like big scale trucking where these the

0:11:22.540,0:11:25.600
trucks are automated they’re sent if you

0:11:24.520,0:11:27.400
ever seen the movie Logan

0:11:25.600,0:11:28.900
there’s automated trucks in that movie

0:11:27.400,0:11:31.390
which are they’re kind of terrifying

0:11:28.900,0:11:32.560
because they don’t stop but um I could

0:11:31.390,0:11:34.930
definitely see that being used for

0:11:32.560,0:11:36.760
commerce maybe or for trade

0:11:34.930,0:11:39.040
a way to automate these ships or

0:11:36.760,0:11:41.260
these trucks to move across long

0:11:39.040,0:11:42.880
distances of course that would eliminate

0:11:41.260,0:11:45.220
those jobs but that’s a different

0:11:42.880,0:11:49.150
discussion yeah well and even if you

0:11:45.220,0:11:51.120
look at it from a direct practical

0:11:49.150,0:11:55.270
standpoint that last mile delivery

0:11:51.120,0:11:56.890
you’ve got companies like Amazon that

0:11:55.270,0:11:57.390
are looking to do automated drone

0:11:56.890,0:11:59.100
delivery

0:11:57.390,0:12:00.630
you know you know the guy sitting in a

0:11:59.100,0:12:02.940
truck with a remote control

0:12:00.630,0:12:07.200
you know these drones are being attached

0:12:02.940,0:12:09.420
to packages and they know based on an

0:12:07.200,0:12:11.880
overhead topographical map where the

0:12:09.420,0:12:13.620
addresses are so the artificial

0:12:11.880,0:12:15.240
intelligence there is flying them in

0:12:13.620,0:12:17.310
dropping them off making sure they don’t

0:12:15.240,0:12:18.630
crash into anyone or decapitate anyone

0:12:17.310,0:12:21.839
yeah I think they tried that for people

0:12:18.630,0:12:23.519
like smashed up the drones they did have

0:12:21.839,0:12:26.940
some issues with interactions with

0:12:23.519,0:12:28.560
people where they tend to

0:12:26.940,0:12:31.649
scare people and you’re not even getting

0:12:28.560,0:12:32.820
to the uncanny valley perspective there

0:12:31.649,0:12:35.519
it’s just people don’t like drones

0:12:32.820,0:12:38.610
flying around their houses so yeah

0:12:35.519,0:12:39.990
that’s that you know the this control

0:12:38.610,0:12:43.050
this automation like if you could have

0:12:39.990,0:12:46.050
an entire fleet of tankers and those

0:12:43.050,0:12:48.630
tankers are computer-controlled you

0:12:46.050,0:12:51.660
could eliminate human error like in the

0:12:48.630,0:12:53.310
example the Exxon Valdez where you can

0:12:51.660,0:12:55.560
avoid a lot of the accidents that you

0:12:53.310,0:12:56.930
run into because the computers can

0:12:55.560,0:12:59.490
react much faster

0:12:56.930,0:13:02.490
the next form that they talk about is

0:12:59.490,0:13:05.180
where we haven’t gotten to yet and this

0:13:02.490,0:13:07.230
is where we sort of get to that scary

0:13:05.180,0:13:10.589
aspect of things and that’s what they

0:13:07.230,0:13:12.980
call theory of mind so machines in the

0:13:10.589,0:13:15.899
next more advanced class not only form

0:13:12.980,0:13:18.750
representations about the world but also

0:13:15.899,0:13:21.779
about other agents and entities in the

0:13:18.750,0:13:23.959
world in psychology it’s called theory

0:13:21.779,0:13:26.760
of mind the understanding that people

0:13:23.959,0:13:28.800
creatures and objects in the world can

0:13:26.760,0:13:31.769
have thoughts and emotions that affect

0:13:28.800,0:13:34.290
their own behavior without understanding

0:13:31.769,0:13:36.120
each other’s motives and intentions and

0:13:34.290,0:13:39.720
without taking into account what

0:13:36.120,0:13:42.959
somebody else knows either about me

0:13:39.720,0:13:47.339
or the environment working together is

0:13:42.959,0:13:49.490
at best difficult so from a theory of

0:13:47.339,0:13:52.829
mind standpoint we’re talking about

0:13:49.490,0:13:55.800
direct interaction with robots in

0:13:52.829,0:13:59.149
this case and the example that I could

0:13:55.800,0:14:03.360
think of here is a manufacturing

0:13:59.149,0:14:07.410
situation you’re building cars we’ve got

0:14:03.360,0:14:09.630
robots in in Auto plants now but those

0:14:07.410,0:14:12.600
robots are single purpose robots right

0:14:09.630,0:14:13.829
so they pick something up and they flip

0:14:12.600,0:14:17.300
it and they put it down and that’s all

0:14:13.829,0:14:22.020
they do well this next level here is

0:14:17.300,0:14:24.839
more like if we think of Ironman and how

0:14:22.020,0:14:26.850
Ironman when he’s building his suit he’s

0:14:24.839,0:14:29.790
interacting with the robots around him

0:14:26.850,0:14:31.560
and when the robot does something wrong

0:14:29.790,0:14:35.459
he yells at the robot and the robot

0:14:31.560,0:14:37.500
exhibits some sadness as a result so

0:14:35.459,0:14:40.470
you’re talking about the first level of

0:14:37.500,0:14:43.290
really intelligent machines here where

0:14:40.470,0:14:45.899
the machines can interact with you and

0:14:43.290,0:14:48.959
you kind of get this with your personal

0:14:45.899,0:14:52.949
assistants now your Amazons your

0:14:48.959,0:14:54.810
Googles I’d even say Siri but Siri’s

0:14:52.949,0:14:59.130
literally the absolute worst personal

0:14:54.810,0:15:02.600
assistant out there but they are able to

0:14:59.130,0:15:04.940
react to you they’re able to know

0:15:02.600,0:15:07.709
environmentally depending on what

0:15:04.940,0:15:09.980
aspects of technology you have in

0:15:07.709,0:15:12.480
your home

0:15:09.980,0:15:14.670
Amazon knows that I walk in the door I

0:15:12.480,0:15:17.670
unlocked my door I turn my lights on it

0:15:14.670,0:15:19.529
knows I’m here and that can trigger a

0:15:17.670,0:15:21.630
reaction of telling me what the weather

0:15:19.529,0:15:25.740
is or starting dinner or turning my air

0:15:21.630,0:15:27.839
conditioner on or something like that so

0:15:25.740,0:15:29.910
you’re getting the point where they’re

0:15:27.839,0:15:31.740
sensing the environment they’re

0:15:29.910,0:15:35.850
interacting with you as an individual

0:15:31.740,0:15:37.680
and then they’re doing something where

0:15:35.850,0:15:39.810
do you think something like that would

0:15:37.680,0:15:42.529
go and what would be a marketable use

0:15:39.810,0:15:45.029
for that well I think it basically turns

0:15:42.529,0:15:46.980
they’re like personal servants right at

0:15:45.029,0:15:48.899
that point on they just aren’t aware

0:15:46.980,0:15:51.089
that they are right because their only

0:15:48.899,0:15:53.640
purpose it’s kind of it became like a

0:15:51.089,0:15:54.690
joke but there’s a part in I don’t know if

0:15:53.640,0:15:57.029
you ever watch Rick and Morty but

0:15:54.690,0:16:00.839
there’s a part where the scientist guy

0:15:57.029,0:16:03.120
creates a robot that is only made to

0:16:00.839,0:16:04.589
give him butter for his toast and the

0:16:03.120,0:16:06.000
robot goes what is my purpose and

0:16:04.589,0:16:08.699
he goes you pass butter and he goes

0:16:06.000,0:16:10.800
dear God because he realizes that that’s

0:16:08.699,0:16:12.439
his only purpose in life so you know

0:16:10.800,0:16:14.279
it’s it’s that step before

0:16:12.439,0:16:16.230
self-awareness which is the next step

0:16:14.279,0:16:19.319
we’ll talk about but it’s that purely

0:16:16.230,0:16:21.059
subservient purely utilitarian

0:16:19.319,0:16:23.040
function right and I think like you

0:16:21.059,0:16:23.970
said a great example is all the personal

0:16:23.040,0:16:25.800
assistants that we have

0:16:23.970,0:16:27.329
and I’m not sure how much of them are

0:16:25.800,0:16:29.250
recognizing you but maybe just

0:16:27.329,0:16:31.139
recognizing inputs or through voice

0:16:29.250,0:16:32.879
commands yes if you have to say you know

0:16:31.139,0:16:35.339
so-and-so turn on the lights

0:16:32.879,0:16:36.779
it doesn’t like I could be wrong but it

0:16:35.339,0:16:38.939
doesn’t like see your face through a camera

0:16:36.779,0:16:42.060
recognize you and then no but for

0:16:38.939,0:16:44.459
instance your Amazon devices can

0:16:42.060,0:16:46.709
recognize your voice yeah yeah Google

0:16:44.459,0:16:48.629
does – yeah when I say hey emails

0:16:46.709,0:16:50.970
so-and-so they don’t say well who’s

0:16:48.629,0:16:52.980
emailing from yeah if somebody else

0:16:50.970,0:16:55.160
comes in if it recognizes Michelle’s

0:16:52.980,0:16:58.079
voice and Michelle says hey email Joe

0:16:55.160,0:17:03.120
the email controversial so it knows so

0:16:58.079,0:17:05.490
there is some very basic cognizance of

0:17:03.120,0:17:09.569
who it’s interacting with at that point
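That basic voice cognizance can be sketched as nearest-neighbor matching of a voice embedding against enrolled user profiles. This is only an illustrative toy: the names, vectors, and threshold below are invented, not how any real Alexa or Google Assistant implementation works.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical enrolled voice profiles: user -> embedding from some voice model.
PROFILES = {
    "Joe": [0.9, 0.1, 0.3],
    "Michelle": [0.2, 0.8, 0.5],
}

def identify_speaker(embedding, threshold=0.85):
    """Return the best-matching enrolled user, or None if nothing is close."""
    best_user = max(PROFILES, key=lambda u: cosine(embedding, PROFILES[u]))
    if cosine(embedding, PROFILES[best_user]) >= threshold:
        return best_user
    return None

print(identify_speaker([0.21, 0.79, 0.52]))  # matches Michelle's profile
```

If the match clears the threshold, the assistant can act on the recognized account (Michelle’s “email Joe” goes out as Michelle); otherwise it could fall back to treating the speaker as a guest.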

0:17:05.490,0:17:11.909
yeah but yeah I agree with you 100%

0:17:09.569,0:17:13.709
there that it’s a limited version but

0:17:11.909,0:17:16.169
it’s definitely there yeah I could

0:17:13.709,0:17:17.520
definitely see that being improved upon

0:17:16.169,0:17:20.429
well I guess it depends how you look at

0:17:17.520,0:17:23.730
it but upgraded to be more intuitive

0:17:20.429,0:17:26.699
where maybe they do have some kind of a

0:17:23.730,0:17:28.799
little bit more um autonomy you know

0:17:26.699,0:17:30.059
with how they can function and how they

0:17:28.799,0:17:31.770
can control different appliances and

0:17:30.059,0:17:33.929
things like that in the future and I go

0:17:31.770,0:17:35.820
back to Tony Stark and Jarvis yeah you

0:17:33.929,0:17:37.110
know that type of thing is is sort of

0:17:35.820,0:17:40.559
the direction that they’re going with

0:17:37.110,0:17:42.510
these personal assistants here and then

0:17:40.559,0:17:46.169
the fourth and final one they talk about

0:17:42.510,0:17:49.559
it’s probably the most controversial I’d

0:17:46.169,0:17:51.360
say and that’s self-awareness the final

0:17:49.559,0:17:54.210
step of AI development is to build

0:17:51.360,0:17:56.940
systems that can form representations

0:17:54.210,0:17:58.980
about themselves consciousness is also

0:17:56.940,0:18:02.130
called self-awareness for a reason I

0:17:58.980,0:18:05.270
want that item is a very different

0:18:02.130,0:18:07.669
statement than I know I want that item

0:18:05.270,0:18:10.620
conscious beings are aware of themselves

0:18:07.669,0:18:13.679
know about their internal States and are

0:18:10.620,0:18:17.100
able to predict feelings of others we

0:18:13.679,0:18:19.110
assume someone honking behind us in

0:18:17.100,0:18:21.480
traffic is angry or impatient because

0:18:19.110,0:18:25.230
that’s how we would feel when we can’t

0:18:21.480,0:18:26.820
get others without theory of mind which

0:18:25.230,0:18:29.090
is the natural progression to this we

0:18:26.820,0:18:33.809
couldn’t make those sorts of inferences

0:18:29.090,0:18:36.330
So where does self-awareness take us? Let me ask you that.

I mean, I personally think that’s where things get a little bit more scary, because the obvious answer is that the difference between us and a self-aware AI is that we have blood and bones and things like that. But on a thought level, we function the same way; our brains are essentially really, really complex computers. So if we’re able to create an AI that can function like that, that can think and store memories and anticipate emotion, and on some level even empathize, you know, if it can recognize that the person honking behind me is angry, it’s able to put itself in that person’s shoes, and I think that shows a higher level of intelligence that’s equal to humans. It’s certainly above animals, I think.
So let me expand on that example real quick. When someone’s honking behind you, do you feel empathy towards them?

No, I just figure I did something wrong, probably.

See, I don’t get mad at them, and that’s one way to do it. Now, when I get someone honking behind me, it annoys me. So do I really want this artificial intelligence controlling a 2,000 pound projectile to get angry because somebody’s honking at it?

Well, that’s the thing, right? If we’re trying to replicate, or maybe we’re not intentionally doing this, but if we’re trying to duplicate the human thought process, humans are clearly not perfect beings, and we’re prone to our baser desires. So I think that is something that could be recognized too. And especially if AI doesn’t have what humans have, like a conscience telling you, you know, a sense of morals, a sense of what is right and wrong, the threat of the consequences of your actions, it might not have anything to stop it from acting on those impulses right away.
You know, it’s funny you mention that. I can’t help but think of Roman mythology. We as humans think that we’re made in God’s image, and when you look at Roman mythology, you had these gods that were cruel, they were sadistic at times, they were compassionate at times, but they were power incarnate. And every time something bad happened, they were angry, we did something wrong, and we were really reflecting our own nature onto them to try to explain things.

Yeah, and even if you think about it, it’s almost emotion-based. You have the god of war, right, who is Mars, and he’s typically portrayed as angry and primal. And then you have, you know, goddesses of love who are lustful and things like that, and you have Zeus, well, not Zeus, Jupiter, who is very prideful. So it’s almost, yeah, like you said, a direct reflection of each of these emotions being represented.
Right, and the scary thing is that we created the gods, you know, from our own minds, and we’re creating the AIs. So there’s a very good chance that we’re gonna have these AIs reflect the same flaws that we do, and the same virtues that we do, hopefully, but it’s the flaws that scare me. You know, if I program my AI to react emotionally to somebody honking its horn, and that reaction is the same one that I have, I have enough self-restraint to know that I can’t pull my car over and go beat this guy up, whereas the car might get angry and cut him off. That’s where it gets kind of scary with self-awareness.

Yeah, and then, this is a hypothetical, but if the AI has access to information on that person, they look up their license plate number, they find their address, you know.

The old car Christine, you know, it’s gonna try and get you.

So, before we move on with the pros and cons and everything, I thought it was worthwhile to lay out the groundwork on what we’re talking about with artificial intelligence. So let’s take a quick break, and we’ll come back and look at the pros and cons of today’s artificial intelligence.

[Music]
For over seven years, the Second Sith Empire has been the premier community guild in the online game Star Wars: The Old Republic. With hundreds of friendly and helpful active members, a weekly schedule of nightly events, annual guild meet and greets, and an active community both on the web and on Discord, the Second Sith Empire is more than your typical gaming group: we’re family. Join us on the Star Forge server for nightly events such as operations, flashpoints, world boss hunts, Star Wars trivia, guild lottery, and much more. Visit us on the web today at http://www.kencostore.com

[Music]
So let’s talk about the pros and cons of artificial intelligence. Being ever the optimist that I am not, let’s look at the pros first. The first one that I have here, and feel free to throw out your own or dispute the ones that I have, is reduction in human error. One example here is that artificial intelligence in weather forecasting has significantly increased the reliability of weather forecasts, so instead of being, I don’t know, three percent correct or five percent correct. Now, do you think reduction in human error is a benefit of AI?

Yeah, definitely, but I don’t know, sometimes human error can be a good thing. Sometimes new ideas are born out of that. So there are two sides to it, but definitely human error when it comes to things like, I don’t know, air traffic control maybe, where there are lives at stake, those are definitely areas where human error, if it could be reduced to zero, would be ideal.

Sure, and air traffic control is a great example, because that’s a very high-stress job. It takes a great toll on air traffic controllers.

That was something else I was gonna bring up too, which kind of goes with this: the human cost of doing a lot of jobs. If your people are overworked or underpaid, replacing them with an AI that wouldn’t care, or where you don’t have to worry about their civil liberties as much, I guess, could be a better replacement than abusing a workforce.
For sure. Well, and that even translates into my next point here, which is reduced risk to humans. The example here is that AI robots could be used to clean up nuclear and chemical waste at Chernobyl and Fukushima, where you don’t have to expose humans to that sort of thing. They’re also available 24/7. So again, we look at workers’ rights here, and we’re assuming we’re not applying them to AI entities, so you can make your machines work 24/7.

Until they gain self-awareness.

Right, until they gain self-awareness and unionize. Next, digital assistants. AI-powered customer service centers can provide fast, efficient service on a consistent basis. Have you ever experienced an AI customer service center?

Well, especially now with the coronavirus, most call centers are not fully staffed. I ordered food from somewhere the other day and had to call, and I went through like three different robots, and none of them helped me. So maybe this is a pro in the future, but I don’t think it’s a pro right now.
Okay, and I would agree. I think I had to call the cable company for work to get a problem taken care of, and it took me 15 minutes just to get to a person; I probably could have had the problem solved by then. Okay, so I’ll buy that. How about decision making? AI is capable of analyzing input from a wide variety of sensors and making quick, decisive choices that can mean life and death in things such as manufacturing and driving cars. Do you think that’s a benefit?

Yeah, I think it can be, unless you end up with something like the trolley dilemma, where you have to put one life above, like, twenty. I think the AI would obviously choose the one life, but I don’t know how well that would sell to, you know, whoever’s behind that decision.

Well, the other twenty would like it, yes, I can tell you that.
Right, I mean, save the twenty over the one. But I don’t know how well that would go over in the press or things like that. A while ago there was that self-driving car that crashed and killed somebody, and people lost their minds over it, and that was one incident. I think it was an accident, I think it was the fault of the person that got hit, I could be wrong, but I think they were crossing where they weren’t supposed to or something. But still, I think people already are so wary of this kind of technology, and even if these systems make a thousand right decisions but one wrong one, I think the public will lash back.
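The “save the twenty, lose the one” logic being debated here is, at bottom, a bare minimize-casualties rule. A toy sketch, with invented scenario numbers, just to make the dilemma concrete:

```python
def choose_action(options):
    """Pick the action with the fewest expected casualties.

    options maps an action name to its expected casualty count.
    Ties break by dictionary order, which is exactly the kind of
    under-specified edge case that worries people.
    """
    return min(options, key=options.get)

# Hypothetical split-second choice facing an autonomous car:
scenario = {"swerve": 1, "stay_in_lane": 20}
print(choose_action(scenario))  # -> swerve
```

The arithmetic is trivial; the hard part is everything the single number hides: who the one is, how certain the counts are, and who answers for the choice afterwards.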
It’s funny you mention that. There was a news article last week, I don’t remember exactly where, pretty sure it was Eastern Europe, where this guy was driving his Tesla Model 3 down the highway, doing 60 or 70 miles an hour, with Autopilot turned on, in the passing lane. And there was a panel van that had flipped over and was in his lane. The artificial intelligence saw it, but because it was traveling at such a speed, and the obstacle was stationary while he was moving so fast, its sensors didn’t pick up the obstacle fast enough to brake in time. It tried to brake, but it wasn’t trying to avoid it, it was trying to brake to a stop, and it slammed right into this truck. And that’s because there wasn’t supposed to be an obstacle there, so its ability to predict an unexpected thing like that was compromised by the fact that its sensors didn’t reach out far enough to know it was there. The guy saw it, he knew it was there, and you could actually see the smoke from the tires as it tried to brake, but it still hit the van.
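The “couldn’t brake in time” part of the story is plain stopping-distance arithmetic. A rough sketch, where the friction coefficient and the detection delay are assumptions for illustration, not figures from any actual incident report:

```python
MPH_TO_MS = 0.44704  # miles per hour to meters per second
G = 9.81             # gravitational acceleration, m/s^2
MU = 0.7             # assumed tire-road friction, dry pavement
DELAY_S = 0.5        # assumed detection-plus-actuation delay, seconds

def stopping_distance_m(speed_mph):
    """Distance covered from detection to standstill, in meters."""
    v = speed_mph * MPH_TO_MS
    reaction = v * DELAY_S            # travel before braking starts
    braking = v ** 2 / (2 * MU * G)   # from the kinematics v^2 = 2 * a * d
    return reaction + braking

print(round(stopping_distance_m(65)))  # about 76 meters
```

At 65 mph the car needs on the order of 75 meters to come to a stop, so if a stationary obstacle is only classified as such well inside that range, hard braking alone cannot prevent the impact, which matches the smoking-tires outcome described above.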
Scary. There’s no, like, manual override for that? If you hit the brake in a Tesla, does it just not work?

I think it’s one of these things where the guy thought he had enough faith in the system.

No way. I would never let a self-driving car self-drive me.

Oh yeah, no, the safest hands are still our own. So how about enhanced human functions? Advanced AI is capable of detecting breast cancer, which we mentioned earlier, and it can detect it at earlier stages than humans can. Do you think something like that, where it’s acting as an extra set of eyes, is a helpful thing for us?

Yeah, that’s all fine to me, using it as a tool, a way to enhance our own abilities. And even with artificial limbs too, all that stuff, replacing people’s arms and legs or whatever they may have lost, 3D printing organs, I hope that’s all wonderful.
So let me flip that particular one around. Using that same technology, artificial intelligence is being employed in facial recognition. Say, theoretically, there’s a robbery at the local convenience store, and there are cameras everywhere. I have cameras around the house, you have cameras on just about every street corner. The police come down and decide they want to do an area warrant, where they collect all the camera footage, they go to the cell phone companies and collect the data from people’s phones that were in the area, and so forth. Then they throw all that information at an AI, and the AI, through facial recognition, cell phone records, and everything else, comes up with a list of 30 suspects that were in the area when the crime occurred, and you happen to be one of those 30. How would that make you feel?

I mean, that’s where you get to, like, Orwellian levels of problems. And it’s actually a real issue, or it’s becoming an issue here now too, but in Hong Kong, during the protests there, people had to cover their faces because there are cameras everywhere. Same thing in London, because the CCTV problem is really, really bad there.

Oh, they’re everywhere there, yeah.

And they straight-up admit that they’re using it to find you if they need to. So when it’s maybe not a violent crime like a burglary, but just a protest, and these people are being rounded up by the government and silenced, that is terrifying. And facial recognition especially, because all your devices are constantly listening and watching. That is terrifying.

Yeah. So that’s the scary aspect, where it can be twisted.
That leads us into the cons. What is bad about AI? The first one that I have here is that it costs a lot; it’s the cost of creation. Increasingly sophisticated AI requires constant improvements in hardware and software. Fortunately, we have computers that are forever increasing in their capacity, but the costs are too. Look at the artificial intelligence in your phone: ten years ago a top-of-the-line phone cost you about 500 or 600 bucks; nowadays you’re looking at over a thousand bucks. Is artificial intelligence worth that cost?

For me personally, I’m not sure, because I don’t use my voice assistant or anything like that; those aspects of artificial intelligence I don’t really use that much. But the way it tracks weather and things like that, or emergency alerts, that’s all very helpful. I’m not sure how much of that $1,000 for the phone is because of AI. I think it’s more the processing, the computing power of phones nowadays and the cameras that are in them, that drives the cost up, plus how the supplies to make the phones are sourced and how those deals are negotiated. But yeah, I think it’s worth it, or, you know, I’m okay
with it being that much.

Okay, then let’s look at unemployment. The dystopian fear that everybody has when you inject machines into the workforce is that they’re gonna replace human jobs, and we’ve seen this happen in the auto industry and other manufacturing industries just with dumb robotic machines. Do you think the threat of unemployment is something that’s real for AI?

Oh yeah, definitely. Unfortunately, I think it’s inevitable, especially if robots end up being cheaper and safer; cheaper is always gonna be the bigger draw for a business. If it helps them minimize cost, and they can also say it maximizes safety because there are no people liable to get hurt, I think it’s just how things are gonna go, unfortunately.
Well, a counter to that would be something like, let’s say, landscaping. Right now the landscapers come to my house, they’ve got six guys that wheel out various machines, and they cut my grass. You could replace those with one intelligent lawn mower that can do all the work itself. You don’t have a lot of people clamoring for landscaping jobs, because they don’t pay particularly well; they’re really bottom-of-the-line, minimum wage type jobs in the summer, mostly for high school kids and stuff like that, and a lot of people don’t want those jobs. So if it’s replacing jobs that people don’t want, is that a good thing?

I don’t know if it’s a good thing, because a lot of high schoolers need those jobs, right, to make a little extra money on the side or to save up for college. But again, I think that if a company is able to find a cheaper way to do their business, they’re always gonna go for that, regardless of, you know, what the employment issue would be.
And I would tend to agree with that. One of the things that we’ve looked at, I work in a manufacturing facility, and some of what we do is very delicate creation of connector parts and stuff like that. It requires far more dexterity and skill than I have, and certainly better vision than I have, but a lot of that is something that could be done by a robot. So if you come into work and you’re doing the same thing over and over every day, is that something where maybe we take five people off the line, we put one robot in there, and then we take one of those five people and train them how to work the robot?

Yeah, I could definitely see that. I mean, that’s what surgeons have, right? I don’t know the specifics, because I’m not a doctor, but they have very complex machines that allow them to do more intricate work, and they have surgeons that specialize in these machines who are really, really good with them. I think they even use video game controllers to move the parts and things like that. So I think there is a middle point between a full robot employee versus entirely human; there’s somewhere in between where you can kind of merge the two.
So let’s go down that route with the medical industry. One of the other complaints about AI is that it’s emotionless, and I think that really comes out in the bedside manner, in the operating room. The machines right now are there to assist doctors; they don’t do the work, the doctors are doing the work. But I can certainly see a situation in the not-too-distant future where a robot is doing an operation, I don’t know, a torn ACL, not anything life-impacting like heart surgery, but secondary surgeries. You’re not gonna joke with the robot while it’s doing it, and it’s not gonna worry that it moved the wrong way and hurt you. Do you think that lack of emotion in AI is a concern when it comes to things like that?
definitely I think that’s one of the

0:37:43.310,0:37:46.430
biggest obstacles when dealing with AI

0:37:44.990,0:37:48.020
just in general not just even in the

0:37:46.430,0:37:49.720
medical field but I think that’s one of

0:37:48.020,0:37:52.730
people’s biggest hang-ups is that it’s

0:37:49.720,0:37:55.340
humans are really good at recognizing

0:37:52.730,0:37:58.280
other humans and other humans that are

0:37:55.340,0:38:00.440
being genuine and any attempt to fake

0:37:58.280,0:38:02.300
that you can you can detect it right

0:38:00.440,0:38:03.920
away and I think that that’s one of the

0:38:02.300,0:38:05.750
issues because we don’t if we don’t have

0:38:03.920,0:38:07.910
that connection especially when someone

0:38:05.750,0:38:09.890
is you know sticking a scalpel in us

0:38:07.910,0:38:13.400
or something I think there has to be

0:38:09.890,0:38:14.900
that level of trust that especially for

0:38:13.400,0:38:17.240
something like that that you need and

0:38:14.900,0:38:19.670
even if even if there was a doctor on

0:38:17.240,0:38:22.550
the other side of the operating you know

0:38:19.670,0:38:24.440
the glass whatever directing or being

0:38:22.550,0:38:26.360
like hey it’s fine there’s still a robot

0:38:24.440,0:38:29.540
there you know and I think that people

0:38:26.360,0:38:31.040
would would have an issue with that but

0:38:29.540,0:38:33.260
it’s probably still gonna go that way

0:38:31.040,0:38:35.870
and people will just like I said maybe

0:38:33.260,0:38:40.030
find that midway point or just learn to

0:38:35.870,0:38:42.560
be okay with it I guess yeah

0:38:40.030,0:38:44.540
static thinking you know today’s AI

0:38:42.560,0:38:46.310
lacks the ability to think outside the

0:38:44.540,0:38:50.660
box it can’t conceive of the world

0:38:46.310,0:38:53.930
around it and basically it’s confined to

0:38:50.660,0:38:56.090
a very specific program do you think

0:38:53.930,0:39:00.290
that inability let’s let’s think cars

0:38:56.090,0:39:02.780
you know self-driving cars right now so they

0:39:00.290,0:39:04.910
detect something in the road you know

0:39:02.780,0:39:06.980
you’ve got a car full of passengers you

0:39:04.910,0:39:08.570
got four people in your car their

0:39:06.980,0:39:09.880
sensors tell you that there’s something

0:39:08.570,0:39:13.810
in the road

0:39:09.880,0:39:18.610
and you have cars in lanes on either

0:39:13.810,0:39:20.830
side of you you can’t stop in time the

0:39:18.610,0:39:23.050
fact that this car knows that there’s

0:39:20.830,0:39:26.050
something there and is willing to slam

0:39:23.050,0:39:28.720
on the brakes to avoid it even if that

0:39:26.050,0:39:30.580
one even if that thing happens to be you

0:39:28.720,0:39:34.690
know a plastic bag that’s blowing across

0:39:30.580,0:39:36.100
the street it could put the passengers

0:39:34.690,0:39:39.670
in danger there do you think that

0:39:36.100,0:39:43.090
inability to analyze objectively

0:39:39.670,0:39:46.660
the surroundings is a detriment to AI

0:39:43.090,0:39:48.280
today yeah yeah definitely and I think

0:39:46.660,0:39:49.570
that’s why you need some kind of human

0:39:48.280,0:39:52.360
oversight to go with all of it right

0:39:49.570,0:39:54.460
until they can get to that level where

0:39:52.360,0:39:57.030
they can store those memories and

0:39:54.460,0:39:59.410
process it and come up with a more I

0:39:57.030,0:40:01.630
guess human way of looking at things

0:39:59.410,0:40:03.160
I think having the human oversight or

0:40:01.630,0:40:06.550
having the ability for the human to

0:40:03.160,0:40:07.210
override that is is almost essential and

0:40:06.550,0:40:09.850
I would agree

0:40:07.210,0:40:11.680
yeah hundred percent and the last one

0:40:09.850,0:40:16.810
that we talked about in the cons is

0:40:11.680,0:40:22.330
AIs don’t improve over time naturally

0:40:16.810,0:40:24.190
so go back to driving cars okay so you

0:40:22.330,0:40:29.290
commute an hour each way to work every

0:40:24.190,0:40:31.480
day you as a driver are constantly

0:40:29.290,0:40:32.650
learning you’re learning how to drive in

0:40:31.480,0:40:34.240
different weather different lighting

0:40:32.650,0:40:37.150
conditions with different vehicles

0:40:34.240,0:40:38.980
around you the more you drive the better

0:40:37.150,0:40:41.410
the driver you become because of the

0:40:38.980,0:40:46.210
experience of driving or being a pilot the

0:40:41.410,0:40:48.700
same way to a car that’s being driven by

0:40:46.210,0:40:50.650
the AI it’s the same thing every day

0:40:48.700,0:40:53.310
it doesn’t learn anything new it doesn’t

0:40:50.650,0:40:56.920
improve its ability it doesn’t store any

0:40:53.310,0:41:00.100
useful experience from that do you think

0:40:56.920,0:41:02.800
that inability for AIs to learn today

0:41:00.100,0:41:04.780
is a detriment to what we expect of them

0:41:02.800,0:41:06.250
yeah and I think that that’s probably

0:41:04.780,0:41:09.670
the next step right that’s that’s

0:41:06.250,0:41:11.740
another stepping stone to self-awareness

0:41:09.670,0:41:13.630
in general because once you start

0:41:11.740,0:41:14.560
learning or once you start remembering

0:41:13.630,0:41:16.210
that’s how you start learning from

0:41:14.560,0:41:18.010
things right so I think once they’re

0:41:16.210,0:41:20.170
able to do that then it’s only a matter

0:41:18.010,0:41:21.340
of time before they’re at least more

0:41:20.170,0:41:22.880
self-aware than they would be

0:41:21.340,0:41:26.090
functioning as they are now

0:41:22.880,0:41:28.190
yeah and and hopefully you know we’ll

0:41:26.090,0:41:31.460
get there and it won’t be at too high

0:41:28.190,0:41:32.810
a cost so that was all we had for that I

0:41:31.460,0:41:34.610
think we’re gonna come back we’re gonna

0:41:32.810,0:41:38.410
take a look at what artificial

0:41:34.610,0:41:38.410
intelligence looks like in the future

0:41:38.750,0:41:49.220
[Music]

0:41:46.240,0:41:50.930
insights into entertainment a podcast

0:41:49.220,0:41:54.550
series taking a deeper look into

0:41:50.930,0:41:57.020
entertainment and media our

0:41:54.550,0:41:59.450
husband-and-wife team of pop culture

0:41:57.020,0:42:01.880
fanatics are exploring all things from

0:41:59.450,0:42:06.320
music and movies to television and

0:42:01.880,0:42:09.850
fandom we’ll look at the interesting and

0:42:06.320,0:42:12.620
obscure entertainment news of the week

0:42:09.850,0:42:15.050
we’ll talk about theme park and pop

0:42:12.620,0:42:18.050
culture news we’ll give you the latest

0:42:15.050,0:42:20.900
and greatest on pop culture conventions

0:42:18.050,0:42:22.890
we’ll give you a deep dive into Disney

0:42:20.900,0:42:24.940
Star Wars and much more

0:42:22.890,0:42:27.540
[Music]

0:42:24.940,0:42:30.220
check out our video episodes at

0:42:27.540,0:42:33.820
youtube.com/insightsintothings our

0:42:30.220,0:42:37.030
audio episodes at

0:42:33.820,0:42:39.790
podcast.insightsintoentertainment.com or

0:42:37.030,0:42:42.840
check us out on the web at

0:42:39.790,0:42:42.840
insightsintothings.com

0:42:44.000,0:42:47.039
[Music]

0:42:47.300,0:42:52.040
so the future of artificial intelligence

0:42:52.460,0:42:58.020
again this is sort of a pros and cons

0:42:55.200,0:43:02.040
things but I kind of broke it down into the

0:42:58.020,0:43:05.790
good, the bad, and the ugly that’s just because

0:43:02.040,0:43:08.580
I’m so positive on my outlook for AI so

0:43:05.790,0:43:12.210
the good that I wanted to talk about one

0:43:08.580,0:43:15.660
great example of this is fleets of

0:43:12.210,0:43:18.330
self-driving cars they can replace

0:43:15.660,0:43:20.130
unsafe drivers coming from New Jersey we

0:43:18.330,0:43:23.580
know there’s a lot of unsafe drivers in

0:43:20.130,0:43:29.340
our state and they can replace human

0:43:23.580,0:43:32.690
error on the roads how do you feel about

0:43:29.340,0:43:35.430
the prospect of of that you mentioned

0:43:32.690,0:43:36.930
Logan and the self-driving trucks and

0:43:35.430,0:43:38.000
you know I remember the scene where they

0:43:36.930,0:43:40.380
got cut off

0:43:38.000,0:43:42.060
so that doesn’t clearly solve all the

0:43:40.380,0:43:44.460
safety issues what are your thoughts on

0:43:42.060,0:43:46.020
that no I think definitely I think that

0:43:44.460,0:43:47.580
I don’t know I think it’s a double-edged

0:43:46.020,0:43:49.290
sword right because you don’t want to

0:43:47.580,0:43:52.890
eliminate oh they’ll eliminate all those

0:43:49.290,0:43:53.970
jobs because they’re sort of the I don’t

0:43:52.890,0:43:55.350
they always say truckers are like the

0:43:53.970,0:43:57.390
backbone of the economy right because

0:43:55.350,0:43:59.790
once those stop moving and commerce

0:43:57.390,0:44:02.100
stops flowing across the country right

0:43:59.790,0:44:03.660
but I also think that the human error is

0:44:02.100,0:44:04.710
a big part of that as well and I think

0:44:03.660,0:44:07.560
if you can eliminate that then that’s

0:44:04.710,0:44:10.470
that’s good too

0:44:07.560,0:44:12.630
I guess maybe if there was a way to kind of

0:44:10.470,0:44:14.790
Tony Stark it where no one’s actually in

0:44:12.630,0:44:17.760
the cab the truck drivers back at you

0:44:14.790,0:44:20.100
know truck driver HQ controlling it you

0:44:17.760,0:44:21.300
know the joystick or something some way

0:44:20.100,0:44:24.120
to do it that way so if there’s still

0:44:21.300,0:44:25.470
that human that human touch so that you

0:44:24.120,0:44:28.170
don’t have a Logan situation where

0:44:25.470,0:44:29.760
they’re just going and not stopping and

0:44:28.170,0:44:30.930
because that’s putting lives at risk too

0:44:29.760,0:44:33.090
so it’s like you’re trading you’re

0:44:30.930,0:44:34.440
trading one risk for another that’s a

0:44:33.090,0:44:38.100
very good point how involved are our

0:44:34.440,0:44:43.350
digital assistants how about if we take

0:44:38.100,0:44:47.790
our Alexa to the next level and we have

0:44:43.350,0:44:49.620
the equivalent of Jarvis in our homes do

0:44:47.790,0:44:52.839
you think that that’s really a direction

0:44:49.620,0:44:55.930
that we’re gonna go and we should go

0:44:52.839,0:44:57.609
no no I don’t cuz I think especially

0:44:55.930,0:44:59.440
because your Alexas and your Googles are all

0:44:57.609,0:45:01.180
controlled by multi-billion dollar

0:44:59.440,0:45:04.420
corporations and I think if those

0:45:01.180,0:45:06.190
corporations were able to develop an AI

0:45:04.420,0:45:08.410
on the level of Jarvis that they would

0:45:06.190,0:45:10.059
absolutely use it for profit they

0:45:08.410,0:45:11.680
already are using Alexas and Googles

0:45:10.059,0:45:13.119
for profit by constantly listening and

0:45:11.680,0:45:15.819
giving you targeted advertisements and

0:45:13.119,0:45:16.989
things like that so I think the worst

0:45:15.819,0:45:18.759
thing that can happen is for those

0:45:16.989,0:45:19.749
things to get smarter you know because

0:45:18.759,0:45:21.599
then they’re only gonna be able to

0:45:19.749,0:45:24.190
exploit consumers even more and and

0:45:21.599,0:45:27.609
constantly monitor people that’s a very

0:45:24.190,0:45:29.170
good point how about with our public

0:45:27.609,0:45:31.660
transportation systems do you think

0:45:29.170,0:45:33.430
automating our trains and buses would be

0:45:31.660,0:45:35.499
something that would be worthwhile I

0:45:33.430,0:45:38.739
think it’s a similar argument to the

0:45:35.499,0:45:42.999
trucking those jobs are important but

0:45:38.739,0:45:46.180
you also do have fatalities when when

0:45:42.999,0:45:48.009
those workers are overworked or they’re

0:45:46.180,0:45:49.269
not under the right conditions to

0:45:48.009,0:45:50.890
operate what they need to operate but

0:45:49.269,0:45:53.650
they still do because it’s you know they

0:45:50.890,0:45:56.710
have to run so yeah I think I would

0:45:53.650,0:45:58.329
probably take a similar approach to the

0:45:56.710,0:46:02.529
trucking where it’s it’s still human

0:45:58.329,0:46:04.059
operated but maybe AI-guided or

0:46:02.529,0:46:07.469
the other way around AI-operated human

0:46:04.059,0:46:09.579
guided I think entirely having it be AI

0:46:07.469,0:46:11.440
except for maybe like trains because

0:46:09.579,0:46:12.969
they’re on a track all right so it’s not

0:46:11.440,0:46:14.589
like a bus we have to navigate traffic

0:46:12.969,0:46:17.289
but I think you know trains could

0:46:14.589,0:46:19.180
probably work automated and you you make

0:46:17.289,0:46:22.089
a very valid point there we had an

0:46:19.180,0:46:26.950
incident in New Jersey few years back

0:46:22.089,0:46:30.849
where we had a serious accident up in

0:46:26.950,0:46:34.719
North Jersey that led to Amtrak and New

0:46:30.849,0:46:36.940
Jersey Transit rather instituting what

0:46:34.719,0:46:40.420
they call positive train control so if

0:46:36.940,0:46:42.309
if a train is detected as going out of

0:46:40.420,0:46:45.249
control the system itself can bring it

0:46:42.309,0:46:47.109
to a safe stop and prevent accidents so
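The overspeed enforcement just described can be pictured as a simple guard condition. This is a hypothetical simplification for illustration only; real positive train control systems compute braking curves from track geometry, signal state, and train weight rather than a flat margin:

```python
# Hypothetical simplification of positive train control (PTC) overspeed logic.
# Real PTC uses track data, signal aspects, and braking-curve physics.

def ptc_penalty_brake(speed_mph: float, limit_mph: float,
                      margin_mph: float = 2.0) -> bool:
    """Return True when the system should take over and brake the train."""
    return speed_mph > limit_mph + margin_mph

# A train approaching a terminal too fast triggers an automatic stop;
# one within the limit is left under the engineer's control.
print(ptc_penalty_brake(speed_mph=21.0, limit_mph=10.0))  # overspeed: brake
print(ptc_penalty_brake(speed_mph=9.0, limit_mph=10.0))   # within limit
```

The point of the sketch is that the decision itself is trivial; the hard engineering is in detecting "going out of control" reliably enough to act on it.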

0:46:45.249,0:46:50.140
that’s really the first phase of this

0:46:47.109,0:46:53.589
automated of public transportation so

0:46:50.140,0:46:55.930
anytime you can bring a multi-thousand

0:46:53.589,0:46:58.900
ton projectile like that safely to a

0:46:55.930,0:47:03.670
stop I think is a good thing but buses I

0:46:58.900,0:47:05.289
agree is you know in New Jersey buses

0:47:03.670,0:47:06.730
have the right of way so buses have a

0:47:05.289,0:47:08.170
tendency of not even looking

0:47:06.730,0:47:10.690
to see if something’s coming and

0:47:08.170,0:47:12.880
they pull out I’ve seen several

0:47:10.690,0:47:16.240
accidents with buses and several very

0:47:12.880,0:47:18.250
near accidents with buses so me

0:47:16.240,0:47:20.350
personally driving on the road with

0:47:18.250,0:47:23.109
buses on a regular basis I think that we

0:47:20.350,0:47:27.180
could benefit from some automation when

0:47:23.109,0:47:31.270
it comes to that medical monitoring and

0:47:27.180,0:47:32.970
proactive prevention of illnesses so for

0:47:31.270,0:47:35.800
instance there’s a sensor that I wear

0:47:32.970,0:47:39.609
that allows me to check my blood sugar

0:47:35.800,0:47:43.660
with my phone so I can very easily scan

0:47:39.609,0:47:46.540
that I could totally see AI going to the

0:47:43.660,0:47:49.050
point of instead of me having to

0:47:46.540,0:47:51.310
directly interact with it to scan

0:47:49.050,0:47:54.670
telling me when my blood sugar is too

0:47:51.310,0:47:55.900
high and that I have to take medicine or

0:47:54.670,0:47:57.000
if it’s too low and I have to eat

0:47:55.900,0:48:00.010
something
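The proactive alerting described here amounts to a threshold check over continuous sensor readings. A minimal sketch, with the caveat that the thresholds and messages are illustrative assumptions, not any real device's API or medical guidance:

```python
# Hypothetical sketch of proactive glucose alerting; thresholds and
# messages are illustrative only, not medical guidance or a real CGM API.
HIGH_MG_DL = 180  # above this, suggest medication
LOW_MG_DL = 70    # below this, suggest eating

def glucose_alert(reading_mg_dl: float) -> str:
    """Turn a sensor reading into a proactive recommendation,
    instead of waiting for the wearer to scan manually."""
    if reading_mg_dl >= HIGH_MG_DL:
        return "high: consider medication"
    if reading_mg_dl <= LOW_MG_DL:
        return "low: eat something"
    return "in range"

print(glucose_alert(200))  # high: consider medication
print(glucose_alert(60))   # low: eat something
```

In practice the "AI" part would be pushing these alerts automatically from a stream of readings rather than waiting for a manual scan.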

0:47:57.000,0:48:03.070
having having an artificial intelligence

0:48:00.010,0:48:07.300
help in that respect would directly

0:48:03.070,0:48:10.930
affect me positively do you see other

0:48:07.300,0:48:12.580
elements of that being useful see cuz

0:48:10.930,0:48:14.859
I’m really cynical right so I see it as

0:48:12.580,0:48:16.270
like you have that sensor right but what

0:48:14.859,0:48:18.670
if that sensor was designed by a

0:48:16.270,0:48:20.109
pharmaceutical company that needs to

0:48:18.670,0:48:21.670
sell more of their medicine all right

0:48:20.109,0:48:23.530
because they make the medicine too so

0:48:21.670,0:48:25.240
what if they’re, oh I’m getting like maybe

0:48:23.530,0:48:27.310
too far in the weeds with this or like

0:48:25.240,0:48:28.300
maybe they’re encouraging you to take

0:48:27.310,0:48:30.430
more of the medicine so that you then

0:48:28.300,0:48:32.680
have to buy more medicine and and I

0:48:30.430,0:48:34.600
don’t know I think it on the surface it

0:48:32.680,0:48:38.170
is very hopeful and the future of it is

0:48:34.600,0:48:39.910
very optimistic and good because any

0:48:38.170,0:48:42.430
advances in medicine are always a plus

0:48:39.910,0:48:44.080
but I also think that a lot of the

0:48:42.430,0:48:46.510
pharmaceutical companies and companies

0:48:44.080,0:48:48.130
in general do have their own I hate the

0:48:46.510,0:48:50.890
word agenda but their own agenda right

0:48:48.130,0:48:53.380
and if they are able to make profit off

0:48:50.890,0:48:54.700
these things there I would not be

0:48:53.380,0:48:56.290
surprised if there were some backdoors

0:48:54.700,0:48:58.359
that they would use to do them and I

0:48:56.290,0:48:59.650
think that’s a valid concern and I think

0:48:58.359,0:49:02.290
that’s a real risk

0:48:59.650,0:49:05.859
so let’s now that you’ve turned all my

0:49:02.290,0:49:07.420
goods into bads let me throw some bads

0:49:05.859,0:49:12.070
out there and get your reaction

0:49:07.420,0:49:13.450
I’ll turn them into goods so one of the

0:49:12.070,0:49:15.970
bad things that I see is that you’re

0:49:13.450,0:49:17.990
gonna have fewer fewer jobs but they’re

0:49:15.970,0:49:19.160
gonna be more specialized

0:49:17.990,0:49:21.890
and you’re gonna have an overall

0:49:19.160,0:49:23.150
reduction in manufacturing jobs that’s

0:49:21.890,0:49:26.450
a valid concern I mean we’re already

0:49:23.150,0:49:30.050
already seeing that okay how about

0:49:26.450,0:49:33.350
insurance companies using AI to

0:49:30.050,0:49:35.900
determine your validity of your claims

0:49:33.350,0:49:38.119
based on data that the AIs are

0:49:35.900,0:49:40.190
accumulating you mean like like camera

0:49:38.119,0:49:42.770
footage and yeah camera footage sensor

0:49:40.190,0:49:44.750
footage from your car you know if they

0:49:42.770,0:49:46.310
detect that your car was driving too

0:49:44.750,0:49:47.810
fast when you got into an accident or

0:49:46.310,0:49:49.610
that you didn’t brake when you said you

0:49:47.810,0:49:50.960
braked they already have that in a way

0:49:49.610,0:49:52.790
like I know my mom she used to have

0:49:50.960,0:49:54.710
progressive and they put like a thing in

0:49:52.790,0:49:56.000
your steering column that if you were a

0:49:54.710,0:49:58.160
safe driver I don’t know how they

0:49:56.000,0:49:59.600
calculated that but I guess like your

0:49:58.160,0:50:01.280
time to brake and speeding up and things

0:49:59.600,0:50:03.500
like that like slamming on brakes or

0:50:01.280,0:50:05.119
whatever yeah it’s voluntary but it

0:50:03.500,0:50:06.680
would affect your rates so I think we’re

0:50:05.119,0:50:08.240
already seeing I mean it’s not an AI

0:50:06.680,0:50:10.010
it’s just like a ticker almost I guess

0:50:08.240,0:50:12.380
but we’re still seeing that data

0:50:10.010,0:50:15.170
collection being used how about the

0:50:12.380,0:50:18.320
general loss of marketable skills as AI

0:50:15.170,0:50:20.840
takes over more mundane jobs I picture a

0:50:18.320,0:50:22.550
future like WALL-E where we’re all fat sitting in

0:50:20.840,0:50:24.680
chairs and we can’t even move around

0:50:22.550,0:50:27.080
because the AIs are doing everything yeah

0:50:24.680,0:50:29.330
I mean maybe but I think that there

0:50:27.080,0:50:30.950
still will be people that service those

0:50:29.330,0:50:33.109
bots I mean until they learn to service

0:50:33.109,0:50:36.950
themselves till then I mean yeah you could just go

0:50:33.109,0:50:36.950
down that rabbit hole right I’ve like

0:50:34.250,0:50:39.680
any job that would require a human to

0:50:36.950,0:50:42.170
fix an AI that that if that AI can then

0:50:39.680,0:50:43.609
just fill that job with another AI you

0:50:42.170,0:50:47.240
know you go further and further down the

0:50:43.609,0:50:49.190
line how about deep fakes they’re

0:50:47.240,0:50:51.950
becoming a real issue nowadays this stuff

0:50:49.190,0:50:53.600
is scary it’s incredible how realistic

0:50:51.950,0:50:56.510
it is do you think it’s gonna have an

0:50:53.600,0:50:58.160
impact on the 2020 election I think it

0:50:56.510,0:51:00.500
already is I think it did in 2016

0:50:58.160,0:51:02.570
because especially I mean we could

0:51:00.500,0:51:04.130
Twitter and social media in general and

0:51:02.570,0:51:05.570
how that’s used to manipulate people

0:51:04.130,0:51:08.359
that’s a whole other show yeah but I

0:51:05.570,0:51:10.220
think deep fake specifically are I mean

0:51:08.359,0:51:12.470
you see them on like you know well look

0:51:10.220,0:51:14.359
at this wacky deep fake of this funny

0:51:12.470,0:51:16.520
character in a movie but it’s like that

0:51:14.359,0:51:18.730
looks so real and the implications of

0:51:16.520,0:51:21.410
that are terrifying I mean I’ve seen

0:51:18.730,0:51:24.560
just people internet personalities that

0:51:21.410,0:51:26.390
have been not arrested or anything but

0:51:24.560,0:51:29.630
they’ve been deep fakes have been made

0:51:26.390,0:51:31.220
of them of explicit photos that they

0:51:29.630,0:51:33.560
never took that were then released

0:51:31.220,0:51:34.910
taken as fact and that is extremely

0:51:33.560,0:51:37.490
damaging to someone’s life that they had

0:51:34.910,0:51:39.380
nothing to do with that and this

0:51:37.490,0:51:42.079
technology is it’s very very hard to

0:51:39.380,0:51:45.170
tell and it’s only getting better yeah

0:51:42.079,0:51:48.440
you know unfortunately okay so we didn’t

0:51:45.170,0:51:51.470
turn any bads good so since we’ve gone

0:51:48.440,0:51:53.750
bad to worse let’s get to the ugly

0:51:51.470,0:51:55.819
hey we always end this show in a flaming

0:51:53.750,0:51:57.109
pile of sadness so we might as well just

0:51:55.819,0:52:00.050
keep going and that’s the direction

0:51:57.109,0:52:03.230
we’re going today so the first one that

0:52:00.050,0:52:05.930
I have here is one that’s a reality now

0:52:03.230,0:52:09.170
to a certain extent and that’s automated

0:52:05.930,0:52:12.109
military drones deciding when to use

0:52:09.170,0:52:13.910
deadly force yeah Skynet and the

0:52:12.109,0:52:16.579
Terminator yeah so what do you think

0:52:13.910,0:52:17.960
that’s a that’s a real concern yeah and

0:52:16.579,0:52:19.910
I think we’re already having those

0:52:17.960,0:52:21.560
effects on the people that maybe the

0:52:19.910,0:52:25.130
drones are not entirely unmanned but the

0:52:21.560,0:52:27.170
people that man those drones it’s really

0:52:25.130,0:52:29.200
damaging to their psyche because they

0:52:27.170,0:52:32.750
become disassociated from that violence

0:52:29.200,0:52:34.790
and there’s a show um Jack Ryan the

0:52:32.750,0:52:36.050
second season that or the first one of

0:52:34.790,0:52:40.040
the seasons that show deals with a guy

0:52:36.050,0:52:42.589
that used to operate these drones and he

0:52:40.040,0:52:44.089
hit him and his team because they

0:52:42.589,0:52:46.130
treated it like a video game almost and

0:52:44.089,0:52:48.710
then he like he has like a psychological

0:52:46.130,0:52:50.690
break because of it but yeah so I think

0:52:48.710,0:52:53.060
even having the humans deal with that now is

0:52:50.690,0:52:55.579
already damaging humans but once that’s

0:52:53.060,0:52:57.319
entirely AI-controlled I don’t know I

0:52:55.579,0:53:00.109
mean I think it it could go both ways

0:52:57.319,0:53:01.940
right because AI if they’re given the

0:53:00.109,0:53:04.760
ability to might be able to better

0:53:01.940,0:53:07.190
distinguish civilians from actual

0:53:04.760,0:53:08.810
enemy combatants but on the other hand

0:53:07.190,0:53:11.060
if they’re never given those protocols

0:53:08.810,0:53:13.839
they might just you know it just removes

0:53:11.060,0:53:16.579
the empathy at that point that’s true

0:53:13.839,0:53:18.380
okay well that was a little I gave you a

0:53:16.579,0:53:23.240
little bit of a positive you did you and

0:53:18.380,0:53:26.060
then I took it away from you man so

0:53:23.240,0:53:29.990
traditionally intelligence enables

0:53:26.060,0:53:32.599
control AI that’s more intelligent than

0:53:29.990,0:53:34.640
humans can institute control over humans

0:53:32.599,0:53:36.500
you think that’s a that’s a possibility

0:53:34.640,0:53:39.190
I mean I think it’s already happening on

0:53:36.500,0:53:43.310
social media right I mean where entire

0:53:43.310,0:53:45.380
Twitter movements are being, well in all

0:53:43.310,0:53:45.380
fairness

0:53:44.210,0:53:47.450
there’s not a lot of human intelligence

0:53:45.380,0:53:50.060
in social media no no but we saw it with

0:53:47.450,0:53:53.570
the 2016 election where bot accounts were

0:53:50.060,0:53:56.690
able to manufacture ideas that then

0:53:53.570,0:53:58.070
swayed how voting was done so I think in

0:53:56.690,0:53:59.240
a way AI is already controlling it

0:53:58.070,0:54:01.310
not to mention like I said before the

0:53:59.240,0:54:03.170
targeted advertisements where if you pick

0:54:01.310,0:54:04.940
up your phone and scream Domino’s at it

0:54:03.170,0:54:06.170
for 30 seconds you’re gonna get a Domino’s

0:54:04.940,0:54:07.970
ad and I guarantee after the show

0:54:06.170,0:54:09.740
we’re gonna get one on our phone so I

0:54:07.970,0:54:12.260
think in in subtle ways like that when

0:54:09.740,0:54:15.020
it comes to consumerism or political

0:54:12.260,0:54:17.000
ideology AI in a way is already I mean

0:54:15.020,0:54:18.680
the AI is being controlled by you know

0:54:17.000,0:54:20.089
companies or things like that but I

0:54:18.680,0:54:22.730
think that’s already being used to sway

0:54:20.089,0:54:24.230
and to herd and maybe not control

0:54:22.730,0:54:25.250
outright but definitely to herd people

0:54:24.230,0:54:27.859
toward certain ideas

0:54:25.250,0:54:31.670
okay so I’ll skip the question on

0:54:27.859,0:54:33.830
manipulating the voting system so the

0:54:31.670,0:54:39.560
last one that I had here is is probably

0:54:33.830,0:54:41.180
the one that is the most apocalyptic this

0:54:39.560,0:54:43.820
is the one that Stephen Hawking has

0:54:41.180,0:54:48.410
spoken about when he was still alive

0:54:43.820,0:54:50.930
and that is super intelligent AI making

0:54:48.410,0:54:53.810
humans obsolete but not just making

0:54:50.930,0:54:56.869
humans obsolete realizing that they’re

0:54:53.810,0:54:58.849
obsolete therefore a threat is that

0:54:56.869,0:55:03.320
something that you think we have to fear

0:54:58.849,0:55:05.720
I think so I don’t know at what point in

0:55:03.320,0:55:06.980
our history but I think if you follow

0:55:05.720,0:55:09.230
that line of thought it definitely makes

0:55:06.980,0:55:11.210
sense right if the AI realized that

0:55:09.230,0:55:12.890
because even humans realize that

0:55:11.210,0:55:14.990
we’re we’re a negative force on

0:55:12.890,0:55:17.270
basically everything in terms of

0:55:14.990,0:55:20.089
the planet then we consume we don’t

0:55:17.270,0:55:22.400
really give back that much in terms

0:55:20.089,0:55:24.230
of input and output our ratios are all

0:55:22.400,0:55:27.349
wrong and I think an AI would be able to

0:55:24.230,0:55:29.330
see that pretty clinically and just make

0:55:27.349,0:55:31.970
the decision that you know would do

0:55:29.330,0:55:35.480
the planet in terms of sustainability a

0:55:31.970,0:55:37.640
favor which you know it’s not a great

0:55:35.480,0:55:38.990
outlook but I don’t think I don’t know

0:55:37.640,0:55:41.540
why they’d see it any other way unless

0:55:38.990,0:55:43.940
they were able to program compassion or

0:55:41.540,0:55:46.369
mercy I think you’ve got a valid point

0:55:43.940,0:55:49.040
there so the one question I do want to

0:55:46.369,0:55:55.010
leave you with and get your reaction on

0:55:49.040,0:55:57.140
is as part of our human evolution do you

0:55:55.010,0:56:01.400
think that AI

0:55:57.140,0:56:05.690
is a tool to lead us down that path or

0:56:01.400,0:56:09.859
do you think it is at best a distraction

0:56:05.690,0:56:14.180
or at worst a detriment to humanity I

0:56:09.859,0:56:16.160
think it’s definitely a tool that allows

0:56:14.180,0:56:17.660
us to elevate what we’re able to do

0:56:16.160,0:56:20.089
through the use of technology just like

0:56:17.660,0:56:22.490
we would do with you know the invention

0:56:20.089,0:56:24.230
of the hammer or the wheel the

0:56:22.490,0:56:26.990
difference is that those things don’t

0:56:24.230,0:56:29.809
have brains right so I think that as you

0:56:26.990,0:56:31.279
follow the progression of how AIs become

0:56:29.809,0:56:34.430
more and more sophisticated unless we

0:56:31.279,0:56:36.529
put a cap on it and stop it stop that

0:56:34.430,0:56:41.289
intelligence from evolving too far

0:56:36.529,0:56:41.289
I do think you’ll even have a

0:56:41.470,0:56:46.519
philosophical crisis on your hands of

0:56:43.880,0:56:47.750
what does it mean to be alive and what

0:56:46.519,0:56:50.059
does it mean and countless

0:56:47.750,0:56:52.730
sci-fi movies and books and things have

0:56:50.059,0:56:54.170
covered this before but I think that if

0:56:52.730,0:56:55.069
we allow it to go far enough that’s

0:56:54.170,0:56:57.799
probably what we’re gonna end up with

0:56:55.069,0:56:58.220
and because we’re already the dominant

0:56:57.799,0:57:00.920
species

0:56:58.220,0:57:02.690
we’ll probably be worried and react

0:57:00.920,0:57:06.769
negatively to another thing you know

0:57:02.690,0:57:08.359
threatening us okay and I wish I could

0:57:06.769,0:57:11.029
disagree with you but I

0:57:08.359,0:57:13.730
I think I’m along the same lines as you

0:57:11.029,0:57:15.470
are yeah so uh since we’re almost to the

0:57:13.730,0:57:17.779
end of things so we don’t end in a

0:57:15.470,0:57:19.670
burning pile of sadness um a couple

0:57:17.779,0:57:21.470
really great movies and books

0:57:19.670,0:57:22.640
that deal with this AI stuff there’s a

0:57:21.470,0:57:23.170
movie I watched recently called Ex

0:57:22.640,0:57:26.509
Machina

0:57:23.170,0:57:28.490
it’s got Oscar Isaac and Domhnall

0:57:26.509,0:57:29.599
Gleeson it deals with kind of like what

0:57:28.490,0:57:31.279
I was just talking about we’re like what

0:57:29.599,0:57:35.450
does it mean to be human and things like

0:57:31.279,0:57:37.730
that also on Netflix Love Death and

0:57:35.450,0:57:40.609
robots there’s a it’s a bunch of short

0:57:37.730,0:57:44.559
episodes but some of them deal with that

0:57:40.609,0:57:47.180
as well also this book I brought for you

0:57:44.559,0:57:49.069
Robopocalypse it doesn’t have the dust

0:57:47.180,0:57:52.039
jacket but yeah that book is basically

0:57:49.069,0:57:54.230
about it takes place after a robot war

0:57:52.039,0:57:57.380
has happened humanity came out on top

0:57:54.230,0:57:59.809
but then it goes back in time and it

0:57:57.380,0:58:02.599
finds like archival footage almost of

0:57:59.809,0:58:05.839
how that came to be and it gives a

0:58:02.599,0:58:06.920
really realistic look at possibly in the

0:58:05.839,0:58:07.460
near future how something like this

0:58:06.920,0:58:09.799
could happen

0:58:07.460,0:58:10.340
so if those concepts interest you at all I

0:58:09.799,0:58:12.200
think those are worth

0:58:10.340,0:58:13.580
checking out so there is fictional

0:58:12.200,0:58:16.070
hope in the future for us

0:58:13.580,0:58:20.420
oh there’s always fiction it can

0:58:16.070,0:58:22.760
ignore everything okay I think that was

0:58:20.420,0:58:25.460
that was all we had for today thank you

0:58:22.760,0:58:28.790
for your time Sam we would invite anyone

0:58:25.460,0:58:33.890
to subscribe to our podcast you can get

0:58:28.790,0:58:36.050
us as insights into tomorrow and for our

0:58:33.890,0:58:38.660
audio podcast our video podcasts you can

0:58:36.050,0:58:42.770
look for insights into things on Apple

0:58:38.660,0:58:46.040
podcast Spotify stitcher Google podcast

0:58:42.770,0:58:48.980
etc we do stream live five days a week

0:58:46.040,0:58:52.250
six days a week on twitch at twitch.tv

0:58:48.980,0:58:55.580
slash insights into things and we would

0:58:52.250,0:58:58.550
welcome comments, suggestions,

0:58:55.580,0:59:00.350
questions anything you have for us for

0:58:58.550,0:59:03.890
the show you can email us at comments at

0:59:00.350,0:59:05.330
insightsintothings.com anything else

0:59:03.890,0:59:09.140
you want to finish with no I think I

0:59:05.330,0:59:11.000
nailed it all right yeah we’re I’m ready

0:59:09.140,0:59:13.130
to jump out the window now right so all

0:59:11.000,0:59:16.180
right another successful podcast in the

0:59:13.130,0:59:16.180
books bye

0:59:16.320,0:59:53.010
[Music]
