How Is Machine Learning Used in Bitdefender Technologies?


The terms “artificial intelligence” and “machine learning” are often used interchangeably, but there’s a huge technical difference between them. While the former is what Hollywood depicts as self-aware machines, the latter comprises finely tuned, single-task algorithms that are nowhere near self-aware.

In cyber security, machine learning algorithms can learn by themselves to make predictions based on previous experience and from daily analysis of millions of malicious programs. Practically, a machine learning algorithm is trained to identify a new or unknown threat based on similarities with known threats.

For example, feeding a machine learning algorithm with all known variants of the CryptoLocker ransomware family will give it the ability to estimate whether an unknown sample is statistically likely – based on the features it shares with known CryptoLocker samples – to be part of the same ransomware family. The trick is to fine-tune the algorithm so that assessment is as accurate as possible without triggering false alarms, i.e. flagging clean files as malicious.
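As a minimal sketch of this idea, the snippet below scores an unknown sample by its feature overlap with known samples of one family, then applies a threshold to keep false positives down. The feature names and the similarity measure (Jaccard) are illustrative assumptions, not Bitdefender's actual indicators or model.

```python
# Hypothetical sketch: family attribution by shared features.
# Feature names are invented for illustration only.

def jaccard(a, b):
    """Similarity between two feature sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# "Training" data: features extracted from known samples of one family.
known_family = [
    {"encrypts_user_files", "contacts_c2", "deletes_shadow_copies"},
    {"encrypts_user_files", "contacts_c2", "drops_ransom_note"},
]

def family_score(sample_features):
    """Highest similarity of an unknown sample to any known sample."""
    return max(jaccard(sample_features, known) for known in known_family)

unknown = {"encrypts_user_files", "drops_ransom_note", "contacts_c2"}
score = family_score(unknown)
# Only classify above a threshold tuned to avoid flagging clean files.
verdict = "likely same family" if score >= 0.5 else "unrelated"
```

A production classifier would learn such a decision boundary from millions of labeled samples rather than use a fixed threshold, but the trade-off is the same: a lower threshold catches more variants at the cost of more false alarms.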

Detections based on machine learning algorithms are more effective than those that rely on signature-based systems, because they achieve high detection rates on new malware variants that no signature yet covers. When implemented in cybersecurity solutions, they can take the fight to the next level and even detect sophisticated threats like APTs.

Revolutionary ideas that grow into breakthrough technologies are what characterize Bitdefender, a company that invests some 25 percent of its yearly budget in researching and developing ambitious security projects.

Bitdefender has a portfolio of 72 patents in areas such as machine learning, anti-spam, anti-phishing, anti-fraud, antimalware, virtualization, BOX-functionality and hardware design, including 42 delivered in the past three years, and 35 under examination. Ten percent of the patents apply to machine learning in malware detection and online threats, deep learning and anomaly-based detection techniques, strengthening Bitdefender’s thought leadership positioning globally.

Since 2009, the development and training of machine learning algorithms has been a key focus for Bitdefender Laboratories, proving extremely effective in detecting threats in a sophisticated, modern threat landscape.

The experience of working with machine learning algorithms to detect new and unknown malware samples has substantially improved detection rates and reduced false positives. For Bitdefender, machine learning has proven the best method in data analysis, polymorphic and generic malware detection, among others.

Although machine learning algorithms can replace humans in the analysis of large amounts of data, machine learning is not a universal security solution, as it needs to be backed by other technologies to be effective. For example, with ransomware, detection is performed by more than one algorithm, each trained to detect malware specific to a particular type or family.
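The multi-algorithm approach can be sketched as a set of per-family detectors whose verdicts are combined into one scan result. The detector logic and family names below are invented for illustration; real detectors would be trained models, not hand-written rules.

```python
# Hypothetical sketch: one detector per malware family, combined into
# a single verdict, mirroring the "more than one algorithm" approach.

def detect_ransomware(features):
    """Fires on behavior typical of file-encrypting ransomware."""
    return "encrypts_user_files" in features and "drops_ransom_note" in features

def detect_wiper(features):
    """Fires on destructive, non-recoverable disk modification."""
    return "overwrites_mbr" in features

DETECTORS = {"ransomware": detect_ransomware, "wiper": detect_wiper}

def scan(features):
    """Return the names of all families whose detector fires on a sample."""
    return [family for family, detect in DETECTORS.items() if detect(features)]

hits = scan({"encrypts_user_files", "drops_ransom_note"})
```

Running specialized detectors in parallel lets each one stay narrowly tuned, which is easier to keep accurate than a single model trained on every threat type at once.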

A machine learning algorithm can be tailored to a client’s needs, but it has to be developed to deliver fewer false positives, especially if user behavior is difficult to predict. In such cases, neural networks or cloud-based detection backed by machine learning and genetic algorithms can be implemented. Where behavior is predictable, anomaly detection methods can also be deployed. Since machine learning is a generic term for algorithms capable of automatically parsing data and extracting common features, the number of algorithms used, as well as their functions, is highly diverse.
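For the predictable-behavior case, a minimal anomaly detector can flag observations that deviate sharply from a historical baseline. The sketch below uses a simple z-score rule on an invented metric (daily file modifications); real products use far richer behavioral models.

```python
import statistics

# Hedged sketch: flag a value as anomalous when it lies more than
# `threshold` standard deviations from the historical mean.

def is_anomalous(history, value, threshold=3.0):
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # Perfectly constant baseline: any deviation is anomalous.
        return value != mean
    return abs(value - mean) / stdev > threshold

# Hypothetical baseline: files a user typically modifies per day.
baseline = [40, 42, 38, 41, 39, 43, 40]
is_anomalous(baseline, 41)    # an ordinary day
is_anomalous(baseline, 5000)  # sudden mass file modification, e.g. ransomware
```

The approach only works when the baseline really is predictable; erratic users would trip the threshold constantly, which is exactly why noisier environments call for the learned models mentioned above.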

Bitdefender doesn’t fully rely on machine learning technology for detection, instead opting for a layered approach. Machine learning is an indispensable part of our security technology stack, not only proactively and accurately identifying new and unknown threats, but also augmenting the detection capabilities of the other technologies in that stack.
