ARTIFICIAL INTELLIGENCE CAN BE EXPLOITED TO HACK CONNECTED VEHICLES

Source: analyticsinsight.net

AI and ML Can Be Used to Conduct Cyberattacks Against Autonomous Cars

Innovative automakers, software developers and tech companies are transforming the automotive industry. Today, drivers enjoy enhanced entertainment and information options, as well as connectivity with the outside world. As cars move toward more autonomous capabilities, the security stakes are rising. According to a report by the UN, Europol and cybersecurity company Trend Micro, cybercriminals could exploit disruptive technologies, including artificial intelligence (AI) and machine learning (ML), to conduct attacks against autonomous cars, drones and IoT-connected vehicles.

The rapid adoption of these technologies inevitably creates a rich target for hackers looking to access personal information and take control of essential automotive functions and features. The possibility of accessing information on driver habits for commercial or criminal purposes, without the driver's knowledge or consent, means attitudes towards preventing, understanding and responding to potential cyberattacks must change.

For instance, the theft of personally identifiable information comes into sharper focus when you consider that virtually all new vehicles on the road today come with embedded, tethered or smartphone-mirroring capabilities. Geolocation, personal trip history and financial details are just some of the personal information that could be stolen through a vehicle's systems with the help of AI and ML.

How Cybercriminals Attack Connected Vehicles

Cybercriminals could conduct attacks that abuse machine learning. The technology is evolving so quickly that autonomous vehicles now rely on ML to recognise their surroundings and the obstacles, such as pedestrians, that must be avoided.
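To make the risk concrete, the sketch below shows how a small, carefully chosen perturbation to the input can flip an ML model's decision, which is the weakness that adversarial attacks on vehicle perception exploit. It is a minimal Python/NumPy illustration; the linear "obstacle detector" is an invented stand-in, not any real automotive model.

```python
# Illustrative sketch only: a toy linear "obstacle detector" and a small
# adversarial perturbation that flips its decision. Real perception stacks are
# deep networks, but the principle of the attack is the same.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=64)                 # toy model: score = patch . weights
patch = rng.uniform(0.0, 1.0, size=64)        # a "sensor patch" the car observes

score = patch @ weights
print("original decision: obstacle" if score > 0 else "original decision: clear")

# Fast-gradient-style attack: nudge each input against the sign of its weight
# (the gradient of the score), with every change bounded by a small epsilon.
epsilon = 1.1 * abs(score) / np.abs(weights).sum()   # just enough to flip the sign
perturbation = -np.sign(score) * epsilon * np.sign(weights)
adversarial = patch + perturbation

new_score = adversarial @ weights
print(f"per-input change bounded by {epsilon:.3f}")
print("new decision: obstacle" if new_score > 0 else "new decision: clear")
```

The change to each input stays tiny, yet the detector's output reverses, which is why perception models need adversarial robustness testing alongside conventional security controls.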

However, these algorithms are still maturing, and hackers could exploit them to aid crime or create chaos. For instance, the AI systems that manage autonomous vehicles and regular traffic could be manipulated by cybercriminals who gain access to the networks that control them.

Understanding the threats to connected cars requires knowing what cybercriminals are trying to achieve. Hackers will attempt different kinds of attacks to reach different goals. The most dangerous objective might be to bypass controls in crucial safety systems such as steering, brakes and transmission. But cybercriminals might also be interested in valuable data managed by the car's software, such as personal details and performance statistics. While such data can be protected with cryptography, this only shifts the problem from protecting the data directly to protecting the cryptographic keys.
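As a minimal illustration of that shift, the Python sketch below (assuming the third-party `cryptography` package is installed; the telemetry record is invented for illustration) shows that encrypted data is only as safe as its key: whoever recovers the key recovers the plaintext.

```python
# Minimal sketch, assuming the third-party `cryptography` package.
# The telemetry record is made up for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # on a real ECU, this key is what must be protected
telemetry = b'{"lat": 52.52, "lon": 13.40, "speed_kph": 88}'

ciphertext = Fernet(key).encrypt(telemetry)

# Without the key, the ciphertext reveals nothing useful about the trip.
# Anyone who extracts the key from firmware or memory, however, reads it all.
assert Fernet(key).decrypt(ciphertext) == telemetry
```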

If cybercriminals are trying to steal sensitive data such as cryptographic keys, they have to know where to look, which usually involves a range of reverse-engineering techniques. For instance, an attacker might introduce faults into the compiled code to see how it breaks, or search for a string corresponding to an error message such as 'engine failure' or 'anti-lock brake system disabled' and trace where that string is used. Sophisticated AI techniques can then help the attacker understand the overall structure of the code and locate its functions.
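The string-hunting step can be as simple as the sketch below, which mimics what the classic `strings` utility does: find printable ASCII runs and their offsets so interesting messages can be traced back to the code that references them. The firmware bytes here are a made-up stand-in, not content from any real ECU image.

```python
# Minimal sketch of the string-hunting step: scan a firmware image for
# printable ASCII runs and report the offsets of diagnostic messages worth
# tracing back to code. The firmware bytes are invented for illustration.
import re

firmware = (
    b"\x00\x13\x37" + b"engine failure" + b"\x00\x90\x90"
    + b"anti-lock brake system disabled" + b"\x00\xff"
)

# Printable ASCII runs of at least 6 characters.
for match in re.finditer(rb"[\x20-\x7e]{6,}", firmware):
    print(f"offset 0x{match.start():06x}: {match.group().decode('ascii')}")
```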

Physical access to a device, on the other hand, means bad actors can tamper with the application itself. This is often done by making one small change to the application code so that a check can be bypassed, generally at the assembly-language level: inverting the logic of a conditional jump, replacing a test with a tautology, or redirecting function calls to code of the attacker's own design.
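The sketch below shows the kind of one-byte patch being described, assuming an x86 target where the short "jump if equal" opcode is 0x74 and its inverse "jump if not equal" is 0x75. The byte string is an invented stand-in for real machine code.

```python
# Minimal sketch of inverting a conditional jump in a compiled binary.
# The bytes below are a made-up x86 fragment, not real ECU code.
code = bytearray(
    b"\x84\xc0"              # test al, al ; did the security check pass?
    b"\x74\x05"              # je  +5      ; on success, jump over the reject path
    b"\xb8\x01\x00\x00\x00"  # mov eax, 1  ; reject
)

JE, JNE = 0x74, 0x75
offset = code.index(JE)   # locate the conditional jump (toy search, not real disassembly)
code[offset] = JNE        # invert its logic: the check's outcome is now reversed
print(code.hex(" "))
```

A single byte is enough to make a failed check behave as if it had passed, which is why code-integrity protection matters as much as the check itself.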

It is not just road vehicles that cybercriminals could hack by exploiting AI and ML algorithms and increased connectivity; attackers could abuse machine learning to affect airspace too. Autonomous drones are also attractive targets because they have the potential to carry 'interesting' payloads such as intellectual property.

Hacking autonomous drones also provides cybercriminals with a potentially easy route to making money: hijacking delivery drones used by retailers, redirecting them to a new location, and then taking the package and selling it on.
