US9519781B2 - Systems and methods for virtualization and emulation assisted malware detection - Google Patents

Systems and methods for virtualization and emulation assisted malware detection

Info

Publication number
US9519781B2
Authority
US
United States
Prior art keywords
virtualization
environment
operations
data
virtualization environments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US13/288,905
Other versions
US20130117848A1 (en)
Inventor
Ali Golshan
James S. Binder
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cyphort Inc
Original Assignee
Cyphort Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cyphort Inc
Priority to US13/288,905 (US9519781B2)
Priority claimed from US13/288,917 (US9792430B2)
Assigned to CYPHORT INC. Assignment of assignors interest; Assignors: BINDER, JAMES S.; GOLSHAN, ALI
Priority to PCT/US2012/063569 (WO2013067508A1)
Priority to EP12844780.2A (EP2774038B1)
Priority to PCT/US2012/063566 (WO2013067505A1)
Priority to CA2854182A (CA2854182A1)
Priority to EP16167215.9A (EP3093762B1)
Priority to CA2854183A (CA2854183A1)
Priority to EP12845692.8A (EP2774039B1)
Publication of US20130117848A1
Priority to US14/629,444 (US9686293B2)
Publication of US9519781B2
Application granted

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/56Computer malware detection or handling, e.g. anti-virus arrangements
    • G06F21/566Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/52Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow
    • G06F21/53Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems during program execution, e.g. stack integrity ; Preventing unwanted data erasure; Buffer overflow by executing in a restricted environment, e.g. sandbox or secure virtual machine
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/552Detecting local intrusion or implementing counter-measures involving long-term monitoring or reporting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/554Detecting local intrusion or implementing counter-measures involving event detection and direct action
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55Detecting local intrusion or implementing counter-measures
    • G06F21/56Computer malware detection or handling, e.g. anti-virus arrangements
    • G06F21/567Computer malware detection or handling, e.g. anti-virus arrangements using dedicated hardware
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/14Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1416Event detection, e.g. attack signature detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/14Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425Traffic logging, e.g. anomaly detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/14Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1441Countermeasures against malicious traffic
    • H04L63/145Countermeasures against malicious traffic the attack involving the propagation of malware through the network, e.g. viruses, trojans or worms
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45558Hypervisor-specific management and integration aspects
    • G06F2009/45587Isolation or security of virtual machine instances
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/455Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines
    • G06F9/45533Hypervisors; Virtual machine monitors
    • G06F9/45541Bare-metal, i.e. hypervisor runs directly on hardware
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L2463/00Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00
    • H04L2463/144Detection or countermeasures against botnets

Definitions

  • the present invention(s) generally relate to malware detection. More particularly, the invention(s) relate to systems and methods for virtualization and emulation assisted malware detection.
  • Cyber-criminals conduct methodical reconnaissance of potential victims to identify traffic patterns and existing defenses.
  • Very sophisticated attacks involve multiple “agents” that individually appear to be legitimate traffic, then remain persistent in the target's network. The arrival of other agents may also be undetected, but when all are in the target network, these agents can work together to compromise security and steal targeted information.
  • Legacy security solutions use a structured process (e.g., signature and heuristics matching) or analyze agent behavior in an isolated context, without the ability to detect future coordinated activity. As a result, legacy security solutions are not able to detect sophisticated malware that is armored, component based, and/or includes different forms of delayed execution.
  • a method comprises intercepting an object provided from a first digital device to a second digital device, determining one or more resources the object requires when the object is executed, instantiating a virtual environment with the one or more resources, processing the object within the virtual environment, tainting operations of the object within the virtual environment, monitoring the operations of the object while processing within the virtual environment, identifying an additional resource of the object while processing that is not provided in the virtual environment, re-instantiating the virtual environment with the additional resource as well as the one or more resources, monitoring the operations of the object while processing within the re-instantiated virtual environment, identifying untrusted actions from the monitored operations, and generating a report identifying the operations and the untrusted actions of the object.
  • the object may comprise an executable file, a batch file, or a data file.
  • the method may further comprise performing a heuristic process on the object and determining the one or more resources the object requires based on the result of the heuristic process. Determining the one or more resources the object requires may be based on metadata associated with the object.
  • the one or more resources may include one or more applications.
  • Generating the report identifying the operations and the untrusted actions of the object may comprise generating a signature to be used to detect malware.
  • generating the report identifying the operations and the untrusted actions of the object may comprise identifying a vulnerability in an application based on the operations and the untrusted actions of the object.
  • the method may further comprise increasing or decreasing a frequency of a clock signal within the virtual environment.
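  • The clock adjustment described above can be pictured as a scaled time source. The sketch below is a minimal illustration only (the class and scaling approach are ours, not the patent's; hooking the guest's time sources is outside the snippet):

```python
import time

class ScaledClock:
    """Clock that advances at `factor` times real time. Exposing such a
    clock to the analyzed object makes sleep-based delays elapse quickly,
    so delayed-execution malware detonates during analysis; a factor
    below 1 instead slows the clock."""

    def __init__(self, factor: float = 60.0):
        self.factor = factor               # 60x: one host second reads as one guest minute
        self._epoch = time.monotonic()

    def now(self) -> float:
        elapsed = time.monotonic() - self._epoch
        return self._epoch + elapsed * self.factor
```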
  • the method may comprise logging a state of the virtual environment while monitoring the operations of the object. Further, re-instantiating the virtual environment with the additional resource as well as the one or more resources may comprise halting the virtual environment and re-instantiating the virtual environment with the logged state.
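  • The provision, monitor, and re-instantiate cycle described in the bullets above can be sketched as a loop that grows the resource set until the object runs to completion. This is a minimal illustration under assumed interfaces (the Trace type and run_in_env callable are ours):

```python
from dataclasses import dataclass, field

@dataclass
class Trace:
    operations: list = field(default_factory=list)   # monitored operations
    missing: set = field(default_factory=set)        # resources the object asked for
    untrusted: list = field(default_factory=list)    # actions flagged as untrusted

def analyze(run_in_env, initial_resources, max_rounds=5):
    """Re-run the object, adding any additional resources it requests,
    until it executes fully or the round limit is reached.

    `run_in_env(resources)` stands in for instantiating the virtual
    environment, tainting the object's operations, processing the
    object, and returning its Trace."""
    resources = set(initial_resources)
    trace = Trace()
    for _ in range(max_rounds):
        trace = run_in_env(frozenset(resources))
        if not trace.missing:
            break                        # object ran to completion
        resources |= trace.missing       # halt, re-provision, and retry
    return {"operations": trace.operations, "untrusted": trace.untrusted}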
  • An exemplary system may comprise a collection module, a virtualization module, a control module, and a report module.
  • the collection module may be configured to receive an object provided from a first digital device to a second digital device.
  • the virtualization module may be configured to instantiate a virtual environment with the one or more resources, to process the object within the virtual environment, to identify an additional resource of the object while processing that is not provided in the virtual environment, to re-instantiate the virtual environment with the additional resource as well as the one or more resources, and to taint operations of the object within the virtual environment.
  • the control module may be configured to determine one or more resources the object requires when the object is processed, to monitor the operations of the object while processing within the virtual environment, to monitor the operations of the object while processing within the re-instantiated virtual environment, and to identify untrusted actions from the monitored operations.
  • the report module may be configured to generate a report identifying the operations and the untrusted actions of the object.
  • An exemplary computer readable medium may comprise instructions.
  • the instructions may be executable by a processor for performing a method.
  • the method may comprise intercepting an object provided from a first digital device to a second digital device, determining one or more resources the object requires when the object is executed, instantiating a virtual environment with the one or more resources, processing the object within the virtual environment, tainting operations of the object within the virtual environment, monitoring the operations of the object while processing within the virtual environment, identifying an additional resource of the object while processing that is not provided in the virtual environment, re-instantiating the virtual environment with the additional resource as well as the one or more resources, monitoring the operations of the object while processing within the re-instantiated virtual environment, identifying untrusted actions from the monitored operations, and generating a report identifying the operations and the untrusted actions of the object.
  • a method comprises intercepting an object provided from a first digital device to a second digital device, instantiating a virtualization environment with the one or more resources, processing the object within the virtualization environment, tracing operations of the object while processing within the virtualization environment, detecting suspicious behavior associated with the object in the virtualization environment, instantiating an emulation environment in response to the detected suspicious behavior, processing the object within the emulation environment, recording responses to the object within the emulation environment, tracing operations of the object while processing within the emulation environment, detecting a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, re-instantiating the virtualization environment in response to the detected divergence, providing the recorded response from the emulation environment to the object in the re-instantiated virtualization environment, monitoring the operations of the object while processing within the re-instantiated virtualization environment, identifying untrusted actions from the monitored operations, and generating a report regarding the identified untrusted actions of the object.
  • the suspicious behavior comprises the object loading data into memory within the virtualization environment but not utilizing the data, the object scanning locations in memory of the virtualization environment and then terminating operations, or the object abruptly halting operations.
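  • The behaviors listed above lend themselves to simple rules over a monitored operation trace. The sketch below is our illustration (the event vocabulary is assumed, not taken from the patent):

```python
def suspicious_reasons(events):
    """`events` is an ordered list of (op, detail) tuples from the trace."""
    loaded = {d for op, d in events if op == "load_memory"}
    used   = {d for op, d in events if op == "use_memory"}
    scans  = sum(1 for op, _ in events if op == "scan_memory")
    last   = events[-1][0] if events else None

    reasons = []
    if loaded - used:
        reasons.append("loaded data into memory but never utilized it")
    if scans and last == "terminate":
        reasons.append("scanned memory locations and then terminated")
    if last == "abort":
        reasons.append("abruptly halted operations")
    return reasons
```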
  • Trace capturing may be performed in a kernel of a digital device hosting the emulation environment.
  • the method may further comprise increasing or decreasing a frequency of a clock signal within the emulation environment.
  • Re-instantiating the virtualization environment in response to the detected divergence may comprise instantiating a modified image of the virtualization environment.
  • Re-instantiating the virtualization environment in response to the detected divergence may comprise halting the virtualization environment and restarting the virtualization environment.
  • the method may further comprise applying state information from the emulation environment to the re-instantiated virtualization environment.
  • the virtualization environment may be re-instantiated at a point in time where divergence is detected between the virtualization environment and the emulation environment.
  • An exemplary system may comprise a collection module, a virtualization module, an emulation module, and a control module.
  • the collection module may be configured to receive an object provided from a first digital device to a second digital device.
  • the virtualization module may be configured to instantiate a virtualization environment with the one or more resources, to process the object within the virtualization environment, to trace operations of the object while processing within the virtualization environment, to detect suspicious behavior associated with the object in the virtualization environment, to monitor the operations of the object while processing within a re-instantiation of the virtualization environment, to identify untrusted actions from the monitored operations, and to generate a report regarding the identified untrusted actions of the object.
  • the emulation module may be configured to instantiate an emulation environment in response to the detected suspicious behavior, to process the object within the emulation environment, to record responses to the object within the emulation environment and to trace operations of the object while processing within the emulation environment.
  • the control module may be configured to detect a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, to re-instantiate the virtualization environment in response to the detected divergence, and to provide the recorded response from the emulation environment to the object in the virtualization environment.
  • An exemplary computer readable medium may comprise instructions.
  • the instructions may be executable by a processor for performing a method.
  • the method may comprise intercepting an object provided from a first digital device to a second digital device, instantiating a virtualization environment with the one or more resources, processing the object within the virtualization environment, tracing operations of the object while processing within the virtualization environment, detecting suspicious behavior associated with the object in the virtualization environment, instantiating an emulation environment in response to the detected suspicious behavior, processing the object within the emulation environment, recording responses to the object within the emulation environment, tracing operations of the object while processing within the emulation environment, detecting a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, re-instantiating the virtualization environment in response to the detected divergence, providing the recorded response from the emulation environment to the object in the re-instantiated virtualization environment, monitoring the operations of the object while processing within the re-instantiated virtualization environment, identifying untrusted actions from the monitored operations, and generating a report regarding the identified untrusted actions of the object.
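  • As a concrete illustration of the divergence-detection step in the claims above, the following minimal sketch (ours, not the patent's) compares two operation traces and returns the first point at which they differ:

```python
from itertools import zip_longest

def find_divergence(virt_trace, emu_trace):
    """Return the index of the first operation at which the traces differ
    (a shorter trace diverges where it ends), or None if identical."""
    for i, (v, e) in enumerate(zip_longest(virt_trace, emu_trace)):
        if v != e:
            return i
    return None
```

  • Per the claims, the virtualization environment may then be re-instantiated at the returned index and supplied with the response recorded in the emulation environment.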
  • FIG. 1 is a diagram of an environment in which some embodiments may be practiced.
  • FIG. 2 is a flow diagram of an exemplary process for detection of malware and subsequent reporting in some embodiments.
  • FIG. 3 is a block diagram of an exemplary security server in some embodiments.
  • FIG. 4 is a conceptual block diagram of a virtualization module in some embodiments.
  • FIG. 5 is a block diagram of an exemplary virtualization module in some embodiments.
  • FIG. 6 is an exemplary virtualization environment for detection of malware in some embodiments.
  • FIG. 7 is a flow diagram of an exemplary malware detection method.
  • FIG. 8 is a flow diagram of an exemplary method of controlling a virtualization environment to detect malware.
  • FIG. 9 is a flow diagram of an exemplary model to detect malware through multiple virtualization environments.
  • FIG. 10 is a block diagram of an exemplary digital device.
  • FIG. 11 is a conceptual block diagram of an emulation environment in some embodiments.
  • FIG. 12 is a block diagram of an exemplary emulation module in some embodiments.
  • FIG. 13 is a flow diagram of an exemplary malware detection method utilizing an emulation environment in some embodiments.
  • FIG. 14 is an exemplary emulation environment for detection of malware in some embodiments.
  • FIG. 15 is a trace diagram of operations by or for an object in an emulation environment in some embodiments.
  • FIG. 16 is a block diagram of divergence detection between a virtualization environment and an emulation environment in some embodiments.
  • FIG. 17 is an exemplary process for a hierarchical reasoning engine (HRE) in some embodiments.
  • Some embodiments of systems and methods described herein describe appliance-based solutions to protect enterprises, governments, and cloud infrastructures against targeted sophisticated attacks with corporate espionage or possibly cyber warfare objectives. By watching patterns of abnormal traffic, various systems and methods described herein may predict interactions, identify vulnerabilities, and predictably deny particular protocols, data, or network paths to developing malware.
  • An exemplary system comprises a heuristics engine, an instrumented execution infrastructure, and an intelligent engine.
  • the heuristics engine may identify payloads that require further static and dynamic analysis.
  • the dynamic and instrumented execution infrastructure may combine both virtualization and emulation environments.
  • the environments may be constantly updated dynamically to enable “suspect” traffic to execute to its fullest extent through divergence detection and distributed interaction correlation.
  • the intelligent engine may exchange and cross-reference data between “on the fly” spawned virtual environments and emulated environments allowing, for example, the implementation of such resources as modified nested page tables.
  • the virtualization environment may recreate all or part of the end-user environment as well as a fully optimized environment to extract the full execution and behavior of potential malware.
  • a contextual environment may also be created to allow analysis of targeted malware built with armoring capabilities such as anti-virtualization, or anti-debugging technologies.
  • FIG. 1 is a diagram of an environment 100 in which some embodiments may be practiced.
  • Systems and methods embodied in the environment 100 may detect malicious activity, identify malware, identify exploits, take preventive action, generate signatures, generate reports, determine malicious behavior, determine targeted information, recommend steps to prevent attack, and/or provide recommendations to improve security.
  • the environment 100 comprises a data center network 102 and a production network 104 that communicate over a communication network 106 .
  • the data center network 102 comprises a security server 108 .
  • the production network 104 comprises a plurality of end user devices 110 .
  • the security server 108 and the end user devices 110 may comprise digital devices.
  • a digital device is any device with a processor and memory. An embodiment of a digital device is depicted in FIG. 10 .
  • the security server 108 is a digital device configured to identify malware and/or suspicious behavior by running virtualized and emulated environments and monitoring behavior of suspicious data within the virtualized and emulated environments.
  • the security server 108 receives suspicious data from one or more data collectors.
  • the data collectors may be resident within or in communication with network devices such as Intrusion Prevention System (IPS) collectors 112 a and 112 b , firewalls 114 a and 114 b , Internet content adaptation protocol/web cache communication protocol (ICAP/WCCP) collectors 116 , milter mail plug-in collectors 118 , switch collectors 120 , and/or access points 124 .
  • data collectors may be at one or more points within the communication network 106 .
  • a data collector, which may include, for example, a test access point (TAP) or switch port analyzer (SPAN) port (e.g., the SPAN port/IDS at switch 120 ), is configured to intercept network data from a network.
  • the data collector may be configured to identify suspicious data. Suspicious data is any data collected by the data collector that has been flagged as suspicious by the data collector and/or any data that is to be processed within the virtualization environment.
  • the data collectors may filter the data before flagging the data as suspicious and/or providing the collected data to the security server 108 .
  • the data collectors may filter out plain text but collect executables or batch files.
  • the data collectors may perform intelligent collecting. For example, data may be hashed and compared to a whitelist.
  • the whitelist may identify data that is safe.
  • the whitelist may identify digitally signed data or data received from a known trusted source as safe. Further, the whitelist may identify previously received information that has been determined to be safe. If data has been previously received, tested within the environments, and determined to be sufficiently trustworthy, the data collector may allow the data to continue through the network.
  • the data collectors may be updated by the security server 108 to help the data collectors recognize sufficiently trustworthy data and to take corrective action (e.g., quarantine and alert an administrator) if untrustworthy data is recognized. In some embodiments, if data is not identified as safe, the data collectors may flag the data as suspicious for further assessment.
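  • A minimal sketch of the hash-and-compare step described above (our illustration; SHA-256 is an assumed choice of digest):

```python
import hashlib

def classify_object(payload: bytes, whitelist: set) -> str:
    """Hash a collected object and consult a whitelist of known-safe digests."""
    digest = hashlib.sha256(payload).hexdigest()
    return "safe" if digest in whitelist else "suspicious"
```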
  • one or more agents or other modules may monitor network traffic for common behaviors and may configure a data collector to collect data when data is directed in a manner that falls outside normal parameters.
  • the agent may determine or be configured to appreciate that a computer has been deactivated, a particular computer does not typically receive any data, or data received by a particular computer typically comes from a limited number of sources. If data is directed to a digital device in a manner that is not typical, the data collector may flag such data as suspicious and provide the suspicious data to the security server 108 .
  • Network devices include any device configured to receive and provide data over a network. Examples of network devices include, but are not limited to, routers, bridges, security appliances, firewalls, web servers, mail servers, wireless access points (e.g., hotspots), and switches. In some embodiments, network devices include IPS collectors 112 a and 112 b , firewalls 114 a and 114 b , ICAP/WCCP servers 116 , devices including milter mail plug-ins 118 , switches 120 , and/or access points 124 .
  • the IPS collectors 112 a and 112 b may include any anti-malware device including IPS systems, intrusion detection and prevention systems (IDPS), or any other kind of network security appliances.
  • the firewalls 114 a and 114 b may include software and/or hardware firewalls. In some embodiments, the firewalls 114 a and 114 b may be embodied within routers, access points, servers (e.g., web servers), or appliances.
  • ICAP/WCCP servers 116 include any web server or web proxy server configured to allow access to a network and/or the Internet.
  • Network devices including milter mail plug-ins 118 may include any mail server or device that provides mail and/or filtering functions and may include digital devices that implement milter, mail transfer agents (MTAs), sendmail, and postfix, for example.
  • Switches 120 include any switch or router.
  • the data collector may be implemented as a TAP, SPAN port, and/or intrusion detection system (IDS).
  • Access points 124 include any device configured to provide wireless connectivity with one or more other digital devices.
  • the production network 104 is any network that allows one or more end user devices 110 to communicate over the communication network 106 .
  • the communication network 106 is any network that may carry data (encoded, compressed, and/or otherwise) from one digital device to another.
  • the communication network 106 may comprise a LAN and/or WAN. Further, the communication network 106 may comprise any number of networks. In some embodiments, the communication network 106 is the Internet.
  • FIG. 1 is exemplary and does not limit systems and methods described herein to the use of only those technologies depicted.
  • data collectors may be implemented in any web or web proxy server and are not limited to servers that implement ICAP and/or WCCP.
  • collectors may be implemented in any mail server and are not limited to mail servers that implement milter.
  • Data collectors may be implemented at any point in one or more networks.
  • although FIG. 1 depicts a limited number of digital devices, collectors, routers, access points, and firewalls, there may be any kind and number of such devices.
  • there may be any number of security servers 108 , end user devices 110 , IPS collectors 112 a and 112 b , firewalls 114 a and 114 b , ICAP/WCCP collectors 116 , milter mail plug-ins 118 , switches 120 , and/or access points 124 .
  • similarly, there may be any number of data center networks 102 and/or production networks 104 .
  • FIG. 2 is a flow diagram of an exemplary process 200 for detection of malware and subsequent reporting in some embodiments.
  • suspect traffic is identified.
  • any network device may be used to monitor and/or collect network traffic for further assessment.
  • the network device and/or another digital device (e.g., the security server 108 ) applies heuristics and/or rules (e.g., comparison of data to a whitelist and/or a blacklist).
  • any technique may be used to flag network traffic as suspicious.
  • the security server 108 may flag data as suspicious if the data is directed towards a known infected computer, a disabled account, or any untrustworthy destination.
  • the security server 108 may flag data as suspicious if the data came from a suspected source of malware or a source that is known to be untrustworthy (e.g., a previously identified botnet server).
  • the data collector and/or agent associated with the data collector may perform packet analysis to identify suspicious characteristics in the collected data including the header, footer, destination IP, origin IP, payload and the like.
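  • The packet-analysis checks named above might look like the following toy rules (ours; the packet fields and lists are assumptions for illustration):

```python
def flag_packet(pkt, untrusted_sources, infected_hosts):
    """`pkt` is a dict with 'src', 'dst', and 'payload' keys."""
    if pkt["src"] in untrusted_sources:
        return "suspicious: source previously identified as untrustworthy"
    if pkt["dst"] in infected_hosts:
        return "suspicious: destination is a known infected or disabled host"
    if pkt["payload"][:2] == b"MZ":      # Windows executable magic bytes
        return "suspicious: payload carries an executable"
    return None                          # no rule fired
```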
  • suspect data and/or suspect processes are tested in one or more virtualization environments for “out of context” behavior analysis of the suspicious data and suspect processes.
  • the suspect data and/or processes are initially virtualized in a set of virtualization environments.
  • Each different virtualization environment may be provisioned differently (e.g., each different virtualization environment may comprise different resources).
  • the initial set of resources for a virtualization environment may be predetermined based on common resources required for processing the data and/or metadata associated with the data. If the suspect data and/or suspect process are determined to be behaving suspiciously in the virtualization environment, the suspect data and/or process may also be processed in an emulation environment as discussed here.
  • the suspect data and/or process is analyzed with multiple virtualization environments to extend predictive analysis to distributed and application interactions as described further herein.
  • the suspect data and/or process may be identified as malware or may behave in an untrusted manner in the virtualized environment.
  • the data and/or process may be processed in a plurality of different virtualization environments with different resources and different limitations. Those skilled in the art will appreciate that the suspicious data and/or process may or may not be further tested after the initial set of environments.
  • in step 206 , contextual behavioral analysis is conducted on the suspect data and suspect processes using one or more emulation environments.
  • if the suspicious data acts suspiciously in one or more virtualization environments (e.g., halting execution without performing functions, storing data without using the data, and the like), the data is processed in one or more emulation environments.
  • the emulation environment may be provisioned based on commonly needed resources, metadata associated with the suspicious data, and/or resources identified as needed during processing of the suspicious data within the virtualization environment.
  • the suspicious data may have direct access to memory data in the emulation environment. The behavior of the suspicious data may be monitored within the emulation environment.
  • exploits are identified and validated based on the behavior of the suspect data or suspect process in the environments.
  • the virtualization and/or emulation environments may be provisioned with various applications and operating systems in order to monitor the behavior of the suspect data or suspect process.
  • the environments may test suspect data or suspect processes against network resources and/or applications to determine vulnerabilities and malicious actions.
  • the assessment of the suspect data and/or process may extend predictive analysis to applications for a fuller or complete identification of targeted vulnerabilities.
  • the virtualization environment may be dynamically re-instantiated and re-provisioned (e.g., the process returns to step 204 with the re-instantiated and/or re-provisioned virtualization environment(s)).
  • data from the emulation environment (e.g., responses from within the emulation environment) may be provided to the re-instantiated virtualization environment.
  • a report is generated that may identify threats and vulnerabilities based on the monitored behaviors of the suspect data and the suspect processes within the testing environments.
  • the report may include a description of exploits, vulnerabilities of applications or operating systems, behaviors of the suspect data, payloads associated with the suspect data, command and control protocols, and probable targets of the suspect data (e.g., what valuable information the suspicious data was attempting to steal).
  • the report may include heuristics, additions to whitelists, additions to blacklists, statistics, or signatures designed to detect the suspect data.
  • the exemplary process 200 may be used to detect distributed attacks characteristic of advanced persistent threats.
  • in one example of a distributed attack, an attacker may send a package to be stored in a specific location on the target computer.
  • the package and the act of storing the package may be benign.
  • the attacker may, over time, subsequently send an attack program. Without the previously stored package, the attack program may also appear benign and may not be detectable as malware by preexisting security solutions.
  • once the attack program retrieves the previously stored package, however, the attack program may attack the target system (e.g., exploit a vulnerability in the operating system to take over the target computer or copy valuable data).
  • the security server 108 may first receive and test a package in at least one of the different environments.
  • a report or other characteristic of the storage (e.g., the location of the stored data as well as the stored data itself) may be retained. When the attack program is later received and tested, the security server 108 may recognize that the previously stored package was stored in that particular location of memory. The security server 108 may retrieve the previously received package and store the package within the location in memory in one of the environments and retest the attack program. If the attack program acts maliciously after receiving the package, the security server 108 may generate a report (e.g., information, signature file, heuristic, and/or the like) to identify the package as well as the attack program in order to protect against similar attacks.
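  • One way to support this retesting is to retain artifacts from earlier detonations and replay them when a later object probes the same locations. A minimal sketch under an assumed interface (the class and method names are ours):

```python
class ArtifactStore:
    """Remember side effects (e.g., data written to specific locations)
    observed when earlier objects were tested, so a later attack program
    can be re-run against them."""

    def __init__(self):
        self._artifacts = {}                 # location -> stored data

    def record(self, location, data):
        self._artifacts[location] = data

    def replay_set(self, locations_probed):
        """Artifacts to pre-load before retesting an object that probed
        these locations in its first run."""
        return {loc: self._artifacts[loc]
                for loc in locations_probed if loc in self._artifacts}
```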
  • the security server 108 may generate a report identifying the exploited vulnerability so that the vulnerability may be corrected (e.g., the operating system patched or upgraded to correct the exploit).
  • the security server 108 may also generate a report identifying the targeted information (e.g., a password file or file of credit card numbers) so that corrective action may be taken (e.g., move the file or encrypt the information).
  • FIG. 3 is a block diagram of an exemplary security server 108 in some embodiments.
  • the security server 108 leverages both virtualization and emulation systems and methods to detect malware's anti-virtualization protections and to accelerate “on-demand” virtualized environments for faster prediction.
  • the security server 108 comprises a collection module 302 , a data flagging module 304 , a virtualization module 306 , an emulation module 308 , a control module 310 , a reporting module 312 , a signature module 314 , and a quarantine module 316 .
  • the collection module 302 is configured to receive network data (e.g., potentially suspicious data) from one or more sources.
  • Network data is data that is provided on a network from one digital device to another.
  • the collection module 302 may flag the network data as suspicious data based on, for example, whitelists, blacklists, heuristic analysis, statistical analysis, rules, and/or atypical behavior.
  • the sources comprise data collectors configured to receive network data. For example, firewalls, IPS, servers, routers, switches, access points and the like may, either individually or collectively, function as or include a data collector.
  • the data collector may forward network data to the collection module 302 .
  • the data collectors filter the data before providing the data to the collection module 302 .
  • the data collector may be configured to collect or intercept data that includes executables and batch files.
  • the data collector may be configured to follow configured rules. For example, if data is directed between two known and trustworthy sources (e.g., the data is communicated between two devices on a whitelist), the data collector may not collect the data.
  • a rule may be configured to intercept a class of data (e.g., all documents created with MICROSOFT® Word software that may include macros or data that may comprise a script).
  • rules may be configured to target a class of attack or payload based on the type of malware attacks on the target network in the past.
  • the security server 108 may make recommendations (e.g., via the reporting module 312 ) and/or configure rules for the collection module 302 and/or the data collectors.
  • the data collectors may comprise any number of rules regarding when data is collected or what data is collected.
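  • Collection rules of the kind described above reduce to simple predicates. A sketch (ours; the content-type labels are illustrative assumptions):

```python
def should_collect(src, dst, content_type, trusted_pairs):
    """Skip traffic between whitelisted devices; intercept risky classes."""
    if (src, dst) in trusted_pairs:
        return False                           # trusted pair, pass through
    risky = {"application/x-msdownload",       # executables
             "application/msword"}             # documents that may carry macros
    return content_type in risky
```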
  • the data collectors located at various positions in the network may not perform any assessment or determination regarding whether the collected data is suspicious or trustworthy.
  • the data collector may collect all or a portion of the network data and provide the collected network data to the collection module 302 which may perform filtering.
  • the data flagging module 304 may perform one or more assessments on the collected data received by the collection module 302 and/or the data collector to determine if the intercepted network data is suspicious.
  • the data flagging module 304 may apply rules as discussed herein to determine if the collected data should be flagged as suspicious.
  • the data flagging module 304 may hash the data and/or compare the data to a whitelist to identify the data as acceptable. If the data is not associated with the whitelist, the data flagging module 304 may flag the data as suspicious.
  • collected network data may be initially identified as suspicious until determined otherwise (e.g., associated with a whitelist) or heuristics find no reason that the network data should be flagged as suspicious.
  • the data flagging module 304 may perform packet analysis to look for suspicious characteristics in the header, footer, destination IP, origin IP, payload, and the like. Those skilled in the art will appreciate that the data flagging module 304 may perform a heuristic analysis, a statistical analysis, and/or signature identification (e.g., signature-based detection involves searching for known patterns of suspicious data within the collected data's code) to determine if the collected network data is suspicious.
  • the data flagging module 304 may be resident at the data collector, at the security server 108 , partially at the data collector, partially at the security server 108 , or on a network device.
  • a router may comprise a data collector and a data flagging module 304 configured to perform one or more heuristic assessments on the collected network data. If the collected network data is determined to be suspicious, the router may direct the collected data to the security server 108 .
  • the data flagging module 304 may be updated.
  • the security server 108 may provide new entries for a whitelist, entries for a blacklist, heuristic algorithms, statistical algorithms, updated rules, and/or new signatures to assist the data flagging module 304 to determine if network data is suspicious.
  • the whitelists, entries for whitelists, blacklists, entries for blacklists, heuristic algorithms, statistical algorithms, and/or new signatures may be generated by one or more security servers 108 (e.g., via the reporting module 312 ).
  • the virtualization module 306 and emulation module 308 may analyze suspicious data for untrusted behavior (e.g., malware or distributed attacks).
  • the virtualization module 306 is configured to instantiate one or more virtualized environments to process and monitor suspicious data.
  • the suspicious data may operate as if within a target digital device.
  • the virtualization module 306 may monitor the operations of the suspicious data within the virtualization environment to determine that the suspicious data is probably trustworthy, malware, or requiring further action (e.g., further monitoring in one or more other virtualization environments and/or monitoring within one or more emulation environments).
  • the virtualization module 306 monitors modifications to a system, checks outbound calls, and checks tainted data interactions.
  • the virtualization module 306 may determine that suspicious data is malware but continue to process the suspicious data to generate a full picture of the malware, identify the vector of attack, determine the type, extent, and scope of the malware's payload, determine the target of the attack, and detect if the malware is to work with any other malware.
  • the security server 108 may extend predictive analysis to actual applications for complete validation.
  • a report may be generated (e.g., by the reporting module 312 ) describing the malware, identify vulnerabilities, generate or update signatures for the malware, generate or update heuristics or statistics for malware detection, and/or generate a report identifying the targeted information (e.g., credit card numbers, passwords, or personal information).
  • the virtualization module 306 may flag suspicious data as requiring further emulation and analytics in the back end if the data exhibits suspicious behavior such as, but not limited to, preparing an executable that is not executed, performing functions without result, processing that suddenly terminates, loading data into memory that is not accessed or otherwise executed, scanning ports, or checking specific portions of memory when those locations in memory may be empty.
  • the virtualization module 306 may monitor the operations performed by or for the suspicious data and perform a variety of checks to determine if the suspicious data is behaving in a suspicious manner.
  • the emulation module 308 is configured to process suspicious data in an emulated environment.
  • malware may require resources that are not available or may detect a virtualized environment. When malware requires unavailable resources, the malware may “go benign” or act in a non-harmful manner.
  • malware may detect a virtualized environment by scanning for specific files and/or memory necessary for hypervisor, kernel, or other virtualization data to execute. If malware scans portions of its environment and determines that a virtualization environment may be running, the malware may “go benign” and either terminate or perform nonthreatening functions.
  • the emulation module 308 processes data flagged as behaving suspiciously by the virtualization environment.
  • the emulation module 308 may process the suspicious data in a bare metal environment where the suspicious data may have direct memory access.
  • the behavior of the suspicious data as well as the behavior of the emulation environment may be monitored and/or logged to track the suspicious data's operations. For example, the emulation module 308 may track what resources (e.g., applications and/or operating system files) are called in processing the suspicious data.
  • the emulation module 308 records responses to the suspicious data in the emulation environment. If a divergence in the operations of the suspicious data between the virtualization environment and the emulation environment is detected, the virtualization environment may be configured to inject the response from the emulation environment. The suspicious data may receive the expected response within the virtualization environment and continue to operate as if the suspicious data was within the targeted digital device. This process is further described herein.
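  • The injection step can be pictured as re-running the object with a responder that answers its probes from the emulation environment's recorded responses. A minimal sketch under assumed interfaces (run_in_virtualization and the query format are ours):

```python
def replay_with_recorded_responses(run_in_virtualization, recorded):
    """`recorded` maps a query (e.g., a probed location) to the answer
    captured in the emulation environment; unrecorded queries fall back
    to the virtualization environment's own answer (None here)."""
    def responder(query):
        # Inject the emulated answer when one exists, so the object
        # continues as if it were on the targeted digital device.
        return recorded.get(query)
    return run_in_virtualization(responder)
```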
  • the control module 310 synchronizes the virtualization module 306 and the emulation module 308 .
  • the control module 310 synchronizes the virtualization and emulation environments.
  • the control module 310 may direct the virtualization module 306 to instantiate a plurality of different virtualization environments with different resources.
  • the control module 310 may compare the operations of different virtualization environments to each other in order to track points of divergence.
  • the control module 310 may identify suspicious data as operating in one manner when the virtualization environment includes INTERNET EXPLORER® browser v. 7.0 or v. 8.0, but operating in a different manner when interacting with INTERNET EXPLORER® browser v. 6.0 (e.g., when the suspicious data exploits a vulnerability that may be present in one version of an application but not present in another version).
  • the control module 310 may track operations in one or more virtualization environments and one or more emulation environments. For example, the control module 310 may identify when the suspicious data behaves differently in a virtualization environment in comparison with an emulation environment. In divergence and correlation analysis, operations performed by or for suspicious data in a virtual environment are compared to operations performed by or for suspicious data in a different virtual environment or an emulation environment. For example, the control module 310 may compare monitored steps of suspicious data in a virtual environment to monitored steps of the same suspicious data in an emulation environment. The functions or steps of or for the suspicious data may be similar but suddenly diverge.
  • the suspicious data may have not detected evidence of a virtual environment in the emulation environment and, unlike the virtualized environment where the suspicious data went benign, the suspicious data undertakes actions characteristic of malware (e.g., hijacks a formerly trusted data or process).
  • control module 310 may re-provision or instantiate a virtualization environment with information from the emulation environment (e.g., a page table including state information and/or response information further described herein) that may not be previously present in the original instantiation of the virtualization environment.
  • the suspicious data may then be monitored in the new virtualization environment to further detect suspicious behavior or untrusted behavior.
  • suspicious behavior of an object is behavior that may be untrusted or malicious. Untrusted behavior is behavior that indicates a significant threat.
  • control module 310 is configured to compare the operations of each virtualized environment in order to identify suspicious or untrusted behavior. For example, if the suspicious data takes different operations depending on the version of a browser or other specific resource when compared to other virtualized environments, the control module 310 may identify the suspicious data as malware. Once the control module 310 identifies the suspicious data as malware or otherwise untrusted, the control module 310 may continue to monitor the virtualized environment to determine the vector of attack of the malware, the payload of the malware, and the target (e.g., control of the digital device, password access, credit card information access, and/or ability to install a bot, keylogger, and/or rootkit). For example, the operations performed by and/or for the suspicious data may be monitored in order to further identify the malware, determine untrusted acts, and log the effect or probable effect.
  • the reporting module 312 is configured to generate reports based on the processing of the suspicious data of the virtualization module 306 and/or the emulation module 308 .
  • the reporting module 312 generates a report to identify malware, one or more vectors of attack, one or more payloads, target of valuable data, vulnerabilities, command and control protocols, and/or behaviors that are characteristics of the malware.
  • the reporting module 312 may also make recommendations to safeguard information based on the attack (e.g., move credit card information to a different digital device, require additional security such as VPN access only, or the like).
  • the reporting module 312 generates malware information that may be used to identify malware or suspicious behavior.
  • the reporting module 312 may generate malware information based on the monitored information of the virtualization environment.
  • the malware information may include a hash of the suspicious data or a characteristic of the operations of or for the suspicious data.
  • the malware information may identify a class of suspicious behavior as being one or more steps being performed by or for suspicious data at specific times. As a result, suspicious data and/or malware may be identified based on the malware information without virtualizing or emulating an entire attack.
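  • Malware information of this kind might pair a content hash with a compact behavioral characteristic. A sketch (ours; the operation names are illustrative assumptions):

```python
import hashlib

def malware_info(payload: bytes, trace):
    """`trace` is an ordered list of (op, detail) tuples from monitoring."""
    behavior = [(step, op) for step, (op, _detail) in enumerate(trace)
                if op in {"write_registry", "open_socket", "inject_process"}]
    return {"sha256": hashlib.sha256(payload).hexdigest(),
            "behavior": behavior}      # ops and the steps at which they occurred
```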
  • the optional signature module 314 is configured to store signature files that may be used to identify malware.
  • the signature files may be generated by the reporting module 312 and/or the signature module 314 .
  • the security server 108 may generate signatures, malware information, whitelist entries, and/or blacklist entries to share with other security servers.
  • the signature module 314 may include signatures generated by other security servers or other digital devices.
  • the signature module 314 may include signatures generated from a variety of different sources including, but not limited to, other security firms, antivirus companies, and/or other third-parties.
  • the signature module 314 may provide signatures which are used to determine if network data is suspicious or is malware. For example, if network data matches the signature of known malware, then the network data may be classified as malware. If network data matches a signature that is suspicious, then the network data may be flagged as suspicious data. The malware and/or the suspicious data may be processed within a virtualization environment and/or the emulation environment as discussed herein.
  • the quarantine module 316 is configured to quarantine suspicious data and/or network data.
  • the quarantine module 316 may quarantine the suspicious data, network data, and/or any data associated with the suspicious data and/or network data.
  • the quarantine module 316 may quarantine all data from a particular digital device that has been identified as being infected or possibly infected.
  • the quarantine module 316 is configured to alert a security administrator or the like (e.g., via email, call, voicemail, or SMS text message) when malware or possible malware has been found.
  • the security server 108 allows an administrator or other personnel to log into the security server 108 .
  • the security server 108 provides a graphical user interface or other user interface that authenticates a user (e.g., via digital signature, password, username, and the like). After the user is authenticated, the security server 108 may allow the user to view the processing of the virtualization module 306 and the emulation module 308 , including infection vectors and vulnerability vectors.
  • the security server 108 may also provide the user with threshold reasoning, which is further described with regard to FIG. 4 .
  • FIG. 4 is a conceptual block diagram 400 of a virtualization module in some embodiments.
  • different processes 402 may be virtualized within one or more virtualization environments 404 .
  • the virtualization environments execute on a host 406 that runs over hardware 408 that is isolated from the suspicious data and/or processes.
  • the control module 310 may analyze the results to identify when suspicious behavior is present (e.g., value X), in what sequence the suspicious behavior occurs (e.g., value Y), and in what process (e.g., value Z).
  • a particular process 402 may be intercepted and tested in a variety of different virtualization environments 404 .
  • Each virtualization environment 404 may operate on a host 406 (e.g., operating system and/or virtual machine software) that executes over a digital device's hardware 408 .
  • the functions of the tested process may be isolated from the host 406 and hardware 408 .
  • Suspicious or untrusted behavior may be identified within the virtualization.
  • a time of exploitation may be identified as value X, an exploited sequence may be identified as value Y, and a process of exploitation may be identified as value Z.
  • the X, Y, Z values may form a description of suspicious data or the process which may be used to measure the threat against a threat matrix.
  • an administrator may store a threat threshold based on the threat matrix, depending upon the level of risk that is acceptable.
  • the threat matrix may be based on interactions with the operating system, time sequence, resources, or events.
  • the degree of malicious behavior may be determined based on a threat value (e.g., comprising a function including the X, Y, and Z values).
  • the interactions with the OS, time sequences, types of interactions, and resources requested may all be elements of the threat matrix.
  • the threat value may be compared to a threat threshold to determine the degree of maliciousness and/or what actions will be taken.
  • a threat threshold may be determined and/or generated based on an administrator's acceptable level of risk.
  • Time, sequence, and process values may be generated for each tested process or data.
  • the time, sequence, and process values may be measured against the threshold using the threat matrix to determine a possible course of action (e.g., quarantine, generate a report, alert an administrator, or allow the process to continue unobstructed); a sketch of this comparison appears below.
  • the X, Y, Z values may be compared to X, Y, Z values associated with the same suspicious data from the emulation environment. If the emulation environment values are different or divergent, further testing within the virtualization environment and/or the emulation environment may be required.
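  • The threat-matrix comparison above might be pictured with the following sketch; the linear weighting and the specific weights are assumptions made for illustration, not a formula from the specification:

```python
# Hedged sketch: combining time (X), sequence (Y), and process (Z) observations
# into a threat value compared against an administrator-set threshold.
from dataclasses import dataclass

@dataclass
class Observation:
    x: float  # time of exploitation
    y: float  # exploited sequence
    z: float  # process of exploitation, encoded as a numeric score

WEIGHTS = {"x": 0.2, "y": 0.5, "z": 0.3}  # illustrative threat-matrix weights
THREAT_THRESHOLD = 0.7                    # set per the acceptable level of risk

def threat_value(obs: Observation) -> float:
    """Illustrative function of the X, Y, and Z values."""
    return WEIGHTS["x"] * obs.x + WEIGHTS["y"] * obs.y + WEIGHTS["z"] * obs.z

obs = Observation(x=0.9, y=0.8, z=0.4)
if threat_value(obs) >= THREAT_THRESHOLD:
    print("untrusted: quarantine, generate a report, or alert an administrator")
else:
    print("allow the process to continue unobstructed")
```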
  • FIG. 5 is a block diagram of an exemplary virtualization module 306 in some embodiments.
  • the virtualization module 306 may comprise a virtual machine module 502 , a resource module 504 , a monitor module 506 , a taint module 508 , a time module 510 , a state module 512 , and a state database 514 .
  • the virtual machine module 502 is configured to generate one or more virtualization environments (e.g., using VMWARE® virtual machines or custom virtual machines) to process and monitor suspicious data.
  • the resource module 504 is configured to provision one or more virtualization environments with plug-ins or other resources.
  • plug-ins are modules built into the virtualization and emulation environments that collect specific data sets from certain system components. This process may be chained to follow an execution through the system, or it may run in parallel if there is a threaded malicious or clean object.
  • the resource module 504 provisions a virtualization environment with an initial set of resources (e.g., operating system, OS updates, applications, and drivers). In some embodiments, the resource module 504 provisions virtualization environments to include resources based on the destination of the suspicious data (e.g., the digital device targeted to receive the suspicious data), device images provisioned by information technology management, or metadata associated with the suspicious data. In some embodiments, the resource module 504 comprises a pre-processing module that determines specific requirements based on network metadata to determine which plug-ins should be implemented within the virtualization environment and in what combination the plug-ins may be launched.
  • the resource module 504 provisions a virtualization environment based on the suspicious data's similarity to malware or other suspicious data.
  • the virtualization module 306 may scan and find that the suspicious data appears to be similar to previously tested suspicious data or malware. Subsequently, the resource module 504 may provision one or more virtualization environments to include resources with known vulnerabilities to monitor whether the suspicious data acts in a similarly untrusted manner.
  • the resource module 504 provisions a virtualization environment based in part on metadata associated with the suspicious data.
  • the virtualization module 306 may receive or retrieve metadata associated with the suspicious data.
  • the resource module 504 may determine, based on the metadata, that one or more applications are required for the suspicious data to function. Subsequently, the resource module 504 may provision one or more virtualization environments with the necessary applications and related support files (e.g., operating system, shared resources, or drivers), as in the sketch below.
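  • A minimal sketch of how the resource module 504 might map metadata onto a provisioning plan; the metadata keys and resource names below are hypothetical:

```python
# Hedged sketch: deriving a virtualization-environment provisioning plan from
# metadata associated with suspicious data. Keys and names are illustrative.
def provision_plan(metadata: dict) -> list[str]:
    """Return the resources to load before instantiating the environment."""
    resources = [metadata.get("target_os", "default_os_image")]  # base image
    ext = metadata.get("file_extension", "")
    if ext in (".html", ".js"):
        resources.append("browser_v6")        # include a vulnerable browser
    if ext == ".pdf":
        resources.append("pdf_reader")        # application needed to function
    resources += ["monitor_plugin", "taint_plugin"]  # instrumentation plug-ins
    return resources

print(provision_plan({"target_os": "windows_7_image", "file_extension": ".pdf"}))
```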
  • Each of the virtualized environments may have one or more different resources.
  • one virtualized environment may include INTERNET EXPLORER® browser v. 6 while another virtualized environment may include INTERNET EXPLORER® browser v. 7.
  • Different virtualized environments may include, in some embodiments, different browser programs (e.g., MOZILLA® FIREFOX® browser), different operating systems (e.g., UNIX® operating system), and/or different drivers.
  • the different virtualization environments may have similar applications or operating systems but different versions or different patches or updates. In this way, the same suspicious data may be processed using different resources. If the suspect data behaves differently with one browser than with another, then there is evidence that the suspicious data may be malware.
  • suspicious data is processed in a plurality of different virtualized environments where each of the different virtualized environments includes a limited number of differences.
  • the control module 310 may provision the virtualization module 306 .
  • the control module 310 may review metadata associated with the suspicious data to determine resources to be available in one or more virtualization environments.
  • the metadata may come from a variety of sources. For example, some metadata may be apparent from the suspicious data such as a file extension or calls associated with the suspicious data.
  • the control module 310 may retrieve information regarding the suspicious data in order to provision the virtualization environment. For example, the control module 310 may determine that the suspicious data may be similar to other malware or suspicious data and provision one or more virtualized environments in a manner to see if the newly acquired suspicious data behaves in an untrusted manner.
  • the control module 310 may also provision the emulation module 308 .
  • the control module 310 may review metadata associated with the suspicious data to determine resources to be available in one or more emulation environments.
  • the control module 310 may also provision an emulation environment based on the provisioning of one or more virtualized environments. For example, the control module 310 may provision the emulation environment based on a virtualized environment where the suspicious data may have behaved abnormally (e.g., in an environment with a specific version of an operating system, the suspicious data scanned one or more areas of memory and then terminated further operations).
  • the emulation environment may, in some embodiments, share resources similar to those provided in a virtualization environment.
  • the virtualization module 306 and/or the collection module 302 may determine resource requirements of or for the suspicious data.
  • the virtualization module 306 receives metadata associated with the suspicious data to determine resources as described herein.
  • the metadata may indicate that the network data is an executable to be run in a WINDOWS® operating system environment or the metadata may indicate that the network data is an executable file to be operated by a browser (e.g., a web application).
  • the virtualization module 306 and/or the control module 310 may dynamically select a variety of resources to provision and instantiate a virtualization environment in order to process the network data and monitor actions.
  • a resource may be missing from one, some, or all of the virtualized environments.
  • the suspicious data may require a different application to be able to execute.
  • the virtualization module 306 may halt a virtualization environment, dynamically provision the virtualization environment with the necessary resources, and re-instantiate the virtualized environment to monitor for changes in behavior of the suspicious data.
  • the monitor module 506 is configured to monitor the virtualization environments instantiated by the virtual machine module 502 . In various embodiments, the monitor module 506 logs each step or function performed by or for the suspicious data within each virtualization environment. In various embodiments, the monitor module 506 logs each operation of the suspicious data, logs changes caused by the operation (e.g., what information is stored in memory and where in memory the information is stored), and logs at what time the operation occurred.
  • the monitor module 506 may compare the operations of the suspicious data in various virtualization environments during or after virtualization. When a divergence is identified between a virtualization environment and an emulation environment or between two virtualization environments, the monitor module 506 may generate a flag or track the results to identify if different operations perform untrusted actions.
  • the taint module 508 is configured to perform taint analysis and/or other techniques to identify and track operations provided by and for the suspect data. As a result, acts associated with the suspicious data, including executions by the suspect data and executions performed by an application or operating system for the suspect data are tracked and logged. By using dynamic taint analysis, the taint module 508 and/or the monitor module 506 may monitor actions to detect whether a value that is normally derived from a trusted source is instead derived by some operation associated with the suspect data.
  • the taint module 508 may initially mark input data from untrusted sources as tainted, then monitor program execution to track how the tainted attribute propagates (i.e., what other data becomes tainted) and to check when tainted data is used in dangerous ways (e.g., use of tainted data as jump addresses or format strings, which may indicate an exploit of a vulnerability such as a buffer overrun or format string vulnerability). A sketch of this propagation appears below.
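  • A toy sketch of that propagation and policy check, assuming a simplified instruction stream; the instruction format is an assumption for illustration:

```python
# Hedged sketch of dynamic taint analysis: untrusted inputs start tainted,
# taint propagates through operations, and tainted jump targets raise alerts.
tainted: set[str] = {"input"}  # names currently holding untrusted data

def execute(op: str, dst: str, srcs: list[str]) -> None:
    # Propagation rule: the destination becomes tainted if any source is tainted.
    if any(s in tainted for s in srcs):
        tainted.add(dst)
    else:
        tainted.discard(dst)
    # Policy check: tainted data must never be used as a jump address.
    if op == "jmp" and srcs and srcs[0] in tainted:
        print(f"ALERT: tainted jump via {srcs[0]} (possible buffer overrun exploit)")

execute("mov", "eax", ["input"])  # eax becomes tainted
execute("add", "ebx", ["eax"])    # taint propagates to ebx
execute("jmp", "", ["ebx"])       # dangerous use of tainted data -> alert
```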
  • the monitor module 506 may look for a variable, string, particular component, or feedback that causes a jump in the code.
  • the monitor module 506 and/or the taint module 508 may be plug-ins within the virtualization environment.
  • the resource module 504 may provision a monitoring plug-in and a taint analysis plug-in with one or more virtualization environments.
  • the virtualization module 306 may detect attacks at time of use in the virtualized environment as well as at the time of writing to memory. In some embodiments, the virtualization module 306 detects when a certain part of memory is illegitimately overwritten by the suspicious data at the time of writing to the memory.
  • the time module 510 provides system resources as expected by the object, creating the perception of accelerated time within the virtualization and/or emulation environments. By increasing or slowing clock signals and processing, the suspicious data may be analyzed in a more detailed manner and/or in a faster time than if the clock signal operated in real time.
  • some malware requires a passage of time before acting; for example, some malware requires seconds, minutes, days, or weeks to pass before becoming active.
  • the time module 510 may increase the clock time in the virtualization or emulation environments in order to trigger suspicious behavior, as in the sketch following this discussion.
  • the time module 510 can also slow clock time within the virtualization and/or emulation environments.
  • the time module 510 may take time slices to specifically identify and characterize processes that are taken by or for the suspicious data.
  • time slice information may be used to isolate an attack vector, describe the suspicious data, or determine the target of the attack.
  • time slice information may indicate that at a certain time and associated step, the suspicious data takes over a formerly trusted process. This information may be used to characterize malware such that when other suspicious data take similar action at the same time and associated step, the suspicious data may be classified as a similar type of malware.
  • the time module 510 may also segment operations by or for the object in the virtualization environment and the emulation environment to simplify comparisons of operations between the virtualization environment and the emulation environment.
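  • A minimal sketch of the accelerated clock, under the assumption that the time module hooks the clock source the object observes; the wrapper function stands in for that hook:

```python
# Hedged sketch: presenting an accelerated clock so time-delayed malware
# triggers quickly during analysis.
import time

ACCELERATION = 86400.0   # one real second is presented as one virtual day
_start = time.monotonic()

def virtual_time() -> float:
    """Clock value exposed to the object, scaled from real elapsed time."""
    return (time.monotonic() - _start) * ACCELERATION

time.sleep(1.5)               # about 1.5 virtual days pass in 1.5 real seconds
if virtual_time() > 86400.0:  # malware waiting 24 hours would now activate
    print("time-delayed behavior triggered under the accelerated clock")
```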
  • the state module 512 tracks the various states of the virtualization environment (e.g., the time, date, and process, as well as what was stored in memory, where it was stored, and when).
  • the virtual machine module 502 may halt a virtualization environment or instantiate a new virtualization environment utilizing the states of a previous virtualization.
  • the state module 512 may monitor the behavior of suspicious data which suspiciously terminates at time T.
  • the virtual machine module 502 may instantiate a new virtualization environment.
  • the state module 512 may perform dynamic state modification to change the new virtualization environment to include the logged states of the previous virtualization environment at time T.
  • the state module 512 and/or the time module 510 may increase the clock signal, decrease the clock signal, or simply change the clock signal depending on the processing of the suspicious data that needs to occur. As a result, the suspicious data may be allowed to execute in a similar environment at the desired time.
  • the new virtualization environment may be slightly different (e.g., include and/or not include one or more resources) from the previous virtualization environment.
  • the virtual machine module 502 does not instantiate a new virtualization environment but rather halts the previous virtualization environment and re-instantiates the previous virtualization environment at a previously logged state with one or more resources.
  • the state database 514 is a database configured to store the state of one or more virtualization environments and/or one or more emulation environments. Those skilled in the art will appreciate that the state database 514 is not limited to databases but may include any data structure. A sketch of state logging and re-instantiation appears below.
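  • The state logging and dynamic state modification described above might look like the following sketch; the dictionary-based snapshot is a simplification, since a real state module 512 would capture guest memory and device state:

```python
# Hedged sketch: log environment state at time T, then build a new
# environment from that snapshot with additional resources.
import copy

state_database: dict[float, dict] = {}  # time T -> logged environment state

def log_state(t: float, env: dict) -> None:
    state_database[t] = copy.deepcopy(env)

def reinstantiate(t: float, extra_resources: list[str]) -> dict:
    """New environment matching the state at time T, plus added resources."""
    env = copy.deepcopy(state_database[t])
    env["resources"] += extra_resources
    env["clock"] = t  # present the clock at the time of interest
    return env

env = {"clock": 0.0, "resources": ["os", "browser_v6"], "memory": {}}
log_state(12.5, env)  # the object terminated suspiciously at time T = 12.5
resumed = reinstantiate(12.5, ["missing_driver"])
print(resumed["resources"], resumed["clock"])
```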
  • control module 310 may continue to monitor the virtualized environment to determine the vector of attack of the malware, the payload of the malware, and the target (e.g., control of the digital device, password access, credit card information access, and/or ability to install a bot, keylogger, and/or rootkit). For example, the operations performed by and/or for the suspicious data may be monitored in order to further identify the malware, determine untrusted acts, and log the effect or probable effect.
  • the virtualization module 306 may halt the virtualization environment and provide new resources. For example, if the suspicious data begins to execute a program but abruptly halts, prepares to run an executable but does not actually run the executable, or constantly checks a section in memory that should typically be empty, then the virtualization module 306 may instantiate new virtualization environments and/or re-provision existing virtualization environments with different resources to see if the suspicious data acts differently. In various embodiments, the emulation module 308 may instantiate an emulation environment to test the suspicious data.
  • the virtualization module 306 tracks different behaviors by different suspicious data in order to identify complex attacks, distributed attacks and/or advanced persistent threats (APT). For example, one type of malware may store an executable in a specific place in memory and then, possibly much later, a second type of malware may access the stored executable and attack a computerized system.
  • the virtualization module 306 may identify and record the behavior of suspicious data which, when executed in a virtualization environment, only stores an executable in a specific place in memory but performs no other functions. If other data is executed in the virtualization environment which checks that specific place in memory, the virtualization module 306 may halt the virtualization, provision the executable from the previous data in the specific location in memory, and re-run the virtualization environment to monitor changes.
  • FIG. 6 is an exemplary virtualization environment 600 for detection of malware in some embodiments.
  • the virtualization environment 600 comprises objects 602 , a network 604 , applications 606 , operating system 608 , a virtual machine 610 , a hypervisor 612 , a manager 614 , a dynamic state manager 616 , and a page table manager 618 .
  • Objects include, but are not limited to, suspicious data and/or processes that are tested in the virtualization environment 600 .
  • the network 604 comprises resources to allow the objects 602 to function and/or operate with access to network resources (e.g., network drivers and ports).
  • the applications 606 include one or more applications or other resources that function with the objects 602 to operate in the virtualization.
  • the applications may include word processing applications, web browsers, applets, scripting engines, and the like.
  • Different virtualization environments may include different applications and/or different versions.
  • one virtualization environment may comprise INTERNET EXPLORER® browser v. 9 while another virtualization environment may comprise MOZILLA® FIREFOX® browser v. 5.0.
  • one virtualization environment may comprise INTERNET EXPLORER® browser v. 9 while three other virtualization environments may comprise v. 8, v. 7, and v. 6 of the INTERNET EXPLORER® browser, respectively.
  • the operating system 608 includes all or part of the operating system necessary for the objects 602 to function within the virtualization.
  • the operating system may include, for example, the UBUNTU® LINUX®, WINDOWS XP®, or OS X MOUNTAIN LION™ operating systems.
  • Different virtualization environments may include different operating systems 608 , and/or include different versions of operating systems 608 (e.g., WINDOWS XP® and WINDOWS® 7.0 operating systems). Further, different virtualization environments may include different applied patches and upgrades.
  • the virtual machine 610 may include any number of virtual machines configured to generate one or more virtualization environments to process the objects 602 .
  • the hypervisor 612 (e.g., a kernel or virtual machine manager) manages resources for the virtualizations and may allow multiple operating systems (e.g., guests) to run concurrently on the host computer.
  • the hypervisor may manage execution of the guest operating systems.
  • the manager 614 is configured to manage monitoring and control of the virtualization environment 600.
  • the control module 310 controls the virtualization environment 600 , including the provisioning, time acceleration, and logging through the manager 614 .
  • the dynamic state manager 616 tracks and logs the state of the machine.
  • the DSM may also store the state for later use within the same or different virtualization environments (e.g., for dynamic state modification).
  • the state may include, for example, the object or object identifier, resources available, time slices when events occurred, and logged events.
  • the state logged by the DSM 616 may also comprise contents in memory, and the locations of those contents in memory over time.
  • the page table manager 618 may receive one or more page tables from the emulation environment.
  • the object may be tested within both the virtualization environment and the emulation environment.
  • the emulation module 308 may log the state of the emulation environment and pass the state information to the virtualization environment 600 as a page table for dynamic state modification of the virtualization environment.
  • the virtualization module 306 re-instantiates the original virtualization environment (e.g., instantiates a modified image of the virtualization environment) and dynamically modifies the state of the virtualization environment using the page table(s) from the emulation environment, or the virtualization module 306 may instantiate a new virtualization environment and load the information from the page table, as in the sketch below.
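  • A minimal sketch of that handoff, under the simplifying assumption that a page table maps virtual pages directly to their contents:

```python
# Hedged sketch: exporting emulation-environment memory as a page table and
# loading it into the virtualization environment for dynamic state modification.
def export_page_table(emu_memory: dict[int, bytes]) -> dict[int, bytes]:
    """Emulation side: snapshot memory pages at the logged state."""
    return dict(emu_memory)

def load_page_table(vm_memory: dict[int, bytes],
                    page_table: dict[int, bytes]) -> None:
    """Virtualization side: overwrite pages so the new instance matches."""
    vm_memory.update(page_table)

emu_mem = {0x1000: b"payload-stage-1", 0x2000: b"\x00" * 16}
vm_mem: dict[int, bytes] = {}
load_page_table(vm_mem, export_page_table(emu_mem))
print(sorted(hex(page) for page in vm_mem))
```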
  • FIG. 7 is a flow diagram of an exemplary malware detection method.
  • an object is intercepted by a data collector.
  • the data collector may be placed on any digital device and/or network device.
  • the resource module 504 inspects what resources the object may require for processing (e.g., dynamic libraries and/or registries the object may affect).
  • the collector captures metadata, including where the object came from, where the object was to be received, and/or what application created the request.
  • the resource module 504 may perform preprocessing by determining what resources are required based on the metadata.
  • the virtual machine module 502 instantiates a first instance of a virtualization environment with one or more resources identified by the resource module 504 .
  • the virtual machine module 502 selects and initiates plug-ins within the virtualization environment for memory allocation, forensics, mutex, filesystem, monitoring, taint analysis, and the like.
  • the object is executed and/or processed within the virtualization environment.
  • the taint module 508 taints operations of the object within the virtualization environment.
  • the taint module 508 may be a plug-in.
  • the taint module 508 taints the object, bit by bit, with trace capture information.
  • the monitor module 506 monitors the operations, assessing what resources were previously allocated and what resources are actually allocated and called within the virtualization environment.
  • Resources that are required and/or called by the object which were not initially provisioned may be assessed as further evidence of malware.
  • sets of newly requested resources may be assessed to determine the likelihood of malware. For example, a particular set of resources may be determined to be malicious. If an object calls that particular set of resources (e.g., by calling resources that have not been initially provisioned, calling resources that were initially provisioned, or calling a combination of resources of which only a few were initially provisioned), the object may be determined to be malicious, as in the sketch below.
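  • The set-based assessment could be sketched as follows; the resource names and known-malicious combinations are illustrative assumptions:

```python
# Hedged sketch: flag an object whose called resources match a known-bad set
# or stray outside the initially provisioned boundaries.
MALICIOUS_SETS = {
    frozenset({"raw_socket", "registry_run_key", "keyboard_hook"}),
    frozenset({"process_injection", "raw_socket"}),
}

def assess(provisioned: set[str], called: set[str]) -> str:
    for bad in MALICIOUS_SETS:
        if bad <= called:  # every resource of a malicious set was called
            return "malicious: known-bad resource combination"
    unprovisioned = called - provisioned  # calls outside original boundaries
    if unprovisioned:
        return f"suspicious: unprovisioned calls {sorted(unprovisioned)}"
    return "no resource-based evidence of malware"

print(assess({"browser", "raw_socket"},
             {"raw_socket", "registry_run_key", "keyboard_hook"}))
```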
  • the monitor module 506 may identify untrusted actions from monitored operations.
  • the monitor module 506 may be a plug-in.
  • the virtual machine module 502 may load only those resources called by the resource module 504 within the virtualization environment. If the object calls a driver that is not originally provided in the virtualization environment (e.g., the object went outside of the original boundaries or the initially accepted criteria), the object's operations may terminate.
  • the virtualization environment is re-instantiated or a new virtualization environment may be instantiated that includes the additionally called resource to further process and monitor the operations of the object.
  • the object runs in a plurality of virtualization environments until all operations called on by or for the object are completed.
  • the control module 310 may compare the operations performed by or for the object in one virtualization to actions performed in another virtualization to analyze for divergence. If the actions taken were similar between the two virtualization environments, then no divergence was found. If the actions taken were different, divergence is found and the differences may be further assessed (e.g., found untrusted actions taken when an unpatched operating system was present).
  • Divergence may be evidence of malware.
  • for example, the object may cease to perform any operations at time T in one virtualization environment but continue to perform many additional operations after time T in another virtualization environment (e.g., use different resources, point to different locations in memory, open a socket, or open output ports). The difference in the environments (e.g., an available exploit) likely influenced the actions of the object and, as such, vulnerabilities may be identified. A sketch of locating such divergence appears below.
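  • A minimal sketch of locating divergence, modeling each trace as an ordered list of (time, operation) pairs; real traces would carry far more detail:

```python
# Hedged sketch: find the first point where two virtualization-environment
# traces differ, including the case where one object terminated early.
def find_divergence(trace_a: list[tuple[float, str]],
                    trace_b: list[tuple[float, str]]):
    """Return (time, op_a, op_b) at the first divergence, or None."""
    for (t, op_a), (_, op_b) in zip(trace_a, trace_b):
        if op_a != op_b:
            return t, op_a, op_b
    if len(trace_a) != len(trace_b):  # one environment stopped at time T
        longer = trace_a if len(trace_a) > len(trace_b) else trace_b
        t, op = longer[min(len(trace_a), len(trace_b))]
        return t, op, "<terminated>"
    return None

a = [(0.1, "open_file"), (0.2, "scan_memory")]
b = [(0.1, "open_file"), (0.2, "scan_memory"), (0.3, "open_socket")]
print(find_divergence(a, b))  # -> (0.3, 'open_socket', '<terminated>')
```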
  • the operations taken for or by the object within the virtualization environment may be measured to determine a threat value.
  • the threat value may be compared to a customizable threshold to determine if the behavior of the object is untrustworthy.
  • the threat value is determined based on X values and Y values.
  • the X values may include those operations taken by a plug-in while the Y value correlates to the plug-in and the virtualization environment (e.g., operating system, kernel, or hypervisor). These two values may be part of a function to determine the threat value of each operation by or for the object, an entire execution path of the object, or a part of the execution path of the object.
  • operations taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object.
  • the threat value may be compared to a threat threshold to determine if the effect of the object within the virtualization environment is sufficiently trustworthy or if the object is behaving in a manner that is sufficiently suspicious to warrant running the object through the emulation environment. Further, the threat value may be compared to the threat threshold to determine that the operations are such that they may be characterized as untrusted and, therefore, the object may be quarantined and further corrective action may be taken.
  • the threat value associated with one or more objects may be increased (e.g., determined to be more threatening and, therefore, indicative of an increased likelihood of maliciousness) based on the resources called by the object.
  • a particular set of resources may be determined to be malicious. If an object calls that particular set of resources, a threat value associated with the object may signify a significantly increased likelihood of maliciousness.
  • the reporting module 312 generates a report identifying operations and untrusted actions of the object.
  • the reporting module 312 may generate a report identifying the object, the payload, the vulnerability, the object of the attack, recommendations for future security, and so on.
  • suspicious data may be provided to the virtualization environment. If the suspicious data behaves in a manner similar to known malware, a class of malware, or a class of data with suspicious behavior, then the object may be quarantined and remedial action taken (e.g., the user of the target digital device may be notified). In some embodiments, the process of testing the suspicious data within a virtualization environment to determine a potential threat may be faster than utilizing signatures in the prior art.
  • FIG. 8 is a flow diagram of an exemplary method of controlling a virtualization environment to detect malware.
  • the state module 512 may log a first instance of the virtualization environment.
  • the state module 512 may log or track the state of the virtualization environment (e.g., time, memory values, location of data within memory, and/or ports called).
  • the state module 512 may log the state of a plurality of virtualization environments operating in parallel.
  • the virtual machine module 502 may halt the first instance of the virtualization environment.
  • the object may have terminated functions after requesting a resource not originally provided in the first instance of the virtualization environment.
  • the request for a resource not originally provisioned is evidence of malware (e.g., requesting access to a resource that the object should not have reason to access).
  • the virtual machine module 502 may permit the first instance of the virtualization environment to continue running and the virtual machine module 502 may instantiate a new instance of the virtualization environment.
  • the resource module 504 determines additional resources for the object. For example, if the object requests a resource not originally provided in the first instance of the virtualization environment, the resource module 504 may identify the desired additional resource. In various embodiments, if a divergence is also detected with another virtualization environment, the resource module 504 may also identify differences in resources between the first and other virtualization environments.
  • in step 808, the virtual machine module 502 re-instantiates the first instance of the virtualization environment including the previously identified resources at the previously logged state. As a result, the object may be presented with an environment that may appear to be unprotected. Further, in step 810, the time module 510 may accelerate the clock signal to the time the object requested the unavailable resource.
  • the monitor module 506 may monitor operations by or for the object within the re-instantiated virtualization environment. In some embodiments, the monitor module 506 monitors the operations by or for the object as if the virtualization environment had not changed. In some embodiments, a plug-in monitors the operations by or for the object and provides information to the monitor module 506 . In step 814 , the monitor module 506 may identify untrusted actions from monitored operations. As discussed herein, the operations, either taken alone or in combination, may be used to determine a threat value. The threat value may be compared to a threat threshold to determine if the object is behaving suspiciously, not behaving suspiciously, or behaving in an untrustworthy manner.
  • the reporting module 312 may generate a report identifying suspicious or untrusted operations as well as any untrusted actions (e.g., vulnerability exploits, target of payload, defenses of the object and so on).
  • the first instance of the virtualization environment may not be halted.
  • a new instance of the virtualization environment is instantiated (without halting the previous instance) including the state information and the like.
  • the first instance of the virtualization environment is halted and then re-instantiated including the state information.
  • FIG. 9 is a flow diagram of an exemplary model to detect malware through multiple virtualization environments.
  • the collection module 302 collects the object and the resource module 504 determines one or more required resources.
  • the virtual machine module 502 may instantiate the first instance of the virtualization environment with the determined resources. Further, in step 906, the virtual machine module 502 may instantiate a second instance of the virtualization environment but with resources that are different from those provided in the first instance of the virtualization environment. For example, versions of applications may be different, operating system patches may be different, or the like.
  • the virtual machine module 502 executes the object within the first and second instances of the virtualization environment.
  • the monitor module 506 may monitor operations of the object within the first and second virtualization environments.
  • the monitor module 506 traces the operations of the object in both virtualization environments.
  • a trace may be based on X values (e.g., operations by or on a plug-in of the virtualization environment) and Y values (e.g., operations between an operating system of the plug-in which may be coordinated with the X values). In some embodiments, not all operations are relevant.
  • one or more actions or operations by the host during processing may be compared against a check system to determine if the action or operation is relevant.
  • if the action or operation is relevant, the action or operation may be given weight and may affect the trace.
  • conversely, if the check system determines that an action or operation is not relevant, then the action or operation may be given no weight and may not affect the trace, as in the sketch below.
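  • The check-system filtering might be sketched as follows; the relevance table and its weights are assumptions for illustration:

```python
# Hedged sketch: keep only host actions the check system deems relevant,
# carrying their weights into the trace.
RELEVANCE = {
    "write_registry": 0.8,
    "open_socket": 0.6,
    "read_config": 0.0,   # routine host activity: not relevant
    "page_fault": 0.0,
}

def build_trace(actions: list[str]) -> list[tuple[str, float]]:
    """Relevant actions become weighted trace entries; the rest are dropped."""
    return [(a, RELEVANCE[a]) for a in actions if RELEVANCE.get(a, 0.0) > 0.0]

print(build_trace(["read_config", "open_socket", "page_fault", "write_registry"]))
```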
  • in step 912, the control module 310 or the monitor module 506 compares the operations of the first instance and the operations of the second instance to determine divergence.
  • the traces of the object in the respective virtualization environments may form an execution tree which may be compared to other execution trees associated with other virtualization environments.
  • divergence between the traces of the two virtualization environments may be found.
  • the control module 310 may halt one or both of the virtualization environments and may notify an administrator of malware.
  • the control module 310 continues processing the object within one or both virtualization environments to further identify characteristics of the suspicious data, targeted vulnerabilities, payload, goal, or the like.
  • the reporting module 312 generates a report identifying operations, suspicious behavior, and/or untrusted actions of the object based, in part, on the comparison. For example, the reporting module 312 may identify the exploit that is present in some digital devices but not others. Further, the report may include recommendations to improve security (e.g., moving valuable information to a more secure location).
  • FIG. 10 is a block diagram of an exemplary digital device 1000 .
  • the digital device 1000 comprises a processor 1002 , a memory system 1004 , a storage system 1006 , a communication network interface 1008 , an I/O interface 1010 , and a display interface 1012 communicatively coupled to a bus 1014 .
  • the processor 1002 is configured to execute executable instructions (e.g., programs).
  • the processor 1002 comprises circuitry or any processor capable of processing the executable instructions.
  • the memory system 1004 is any memory configured to store data. Some examples of the memory system 1004 are storage devices, such as RAM or ROM. The memory system 1004 can comprise the RAM cache. In various embodiments, data is stored within the memory system 1004 . The data within the memory system 1004 may be cleared or ultimately transferred to the storage system 1006 .
  • the storage system 1006 is any storage configured to retrieve and store data. Some examples of the storage system 1006 are flash drives, hard drives, optical drives, and/or magnetic tape.
  • the digital device 1000 includes a memory system 1004 in the form of RAM and a storage system 1006 in the form of flash memory. Both the memory system 1004 and the storage system 1006 comprise computer readable media which may store instructions or programs that are executable by a computer processor, including the processor 1002.
  • the communication network interface (com. network interface) 1008 can be coupled to a network (e.g., communication network 114 ) via the link 1016 .
  • the communication network interface 1008 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example.
  • the communication network interface 1008 may also support wireless communication (e.g., communication using the 802.11 a/b/g/n standard or the WIMAX® standard). It will be apparent to those skilled in the art that the communication network interface 1008 can support many wired and wireless standards.
  • the optional input/output (I/O) interface 1010 is any device that receives input from a user and outputs data.
  • the optional display interface 1012 is any device that is configured to output graphics and data to a display.
  • the display interface 1012 is a graphics adapter. It will be appreciated that not all digital devices 1000 comprise either the I/O interface 1010 or the display interface 1012 .
  • a digital device 1000 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein.
  • encoding and/or decoding may be performed by the processor 1002 and/or a co-processor located on a GPU (e.g., an NVIDIA® GPU).
  • FIG. 11 is a conceptual block diagram of an emulation environment 1100 in some embodiments.
  • the emulation environment 1100 may be instrumented and allow the object direct access to memory.
  • malware that searches for evidence of virtualization or evidence of a security program may conclude that a target machine is sufficiently unprotected and, as such, may engage in malicious behavior without early termination.
  • the emulation environment 1100 comprises the process 1102 being tested, the hypervisor 1104 , the host 1106 , and the hardware 1108 .
  • the process 1102 may comprise the functions of or for an object received from the virtualization module 306 .
  • the hypervisor 1104 may provision the emulation environment 1100 and synchronize operations between one or more virtualization environments and the emulation environment 1100 .
  • the hypervisor 1104 initially provisions the emulation environment based on metadata associated with the data to be assessed and/or resources identified within the virtualization environment(s).
  • the hypervisor 1104 may be an emulation manager configured to control the emulation environment. In one example, the hypervisor 1104 may redirect commands between the process 1102 , host 1106 and/or hardware 1108 . In various embodiments, the hypervisor 1104 may receive trace information from a trace capture plug-in within the emulation environment 1100 to trace behavior of the object (e.g., commands from and/or responses to the object in the emulation environment 1100 ). In various embodiments, the hypervisor 1104 is a kernel.
  • the host 1106 comprises the host system (e.g., operating system), support applications, and other data at the O/S layer.
  • the hardware 1108 includes the drivers and hardware interfaces at the hardware layer.
  • the hypervisor 1104 determines trace values to compare against the trace values of the virtualization environment. Since the emulation environment is not a virtualization environment, the object may behave in a different manner and, as such, the trace values between the emulation environment and the virtualization environment may be different.
  • a control module 310 may perform divergence analysis by comparing the trace values from the virtualization module 306 and the emulation module 308 . If the values are different, the control module 310 may halt the virtualization environment to control the virtualization and include one or more responses recorded in the emulation environment which is further discussed herein.
  • the emulation module 308 may generate trace values to compare with tracing of the object in the virtualization environment to detect divergence.
  • the events associated with the object may be evaluated based on a time value X′, a sequence value Y′ and a process value Z′. These values may be compared to values of the virtualization environment to identify divergence. Divergence detection is further discussed herein.
  • FIG. 12 is a block diagram of an exemplary emulation module 308 in some embodiments.
  • the emulation module 308 comprises an emulation engine 1202 , a plug-in module 1204 , a trace capture module 1206 , a recording module 1208 , a manager module 1210 , a time module 1214 , and a hierarchical reasoning engine (HRE) 1216 .
  • the emulation module 308 implements an emulation environment and may be instrumented (e.g., via plug-ins that operate with and/or within the emulation environment).
  • the emulation module 308 may allow an object direct memory access.
  • the emulation module 308 may instantiate any number of emulation environments. In one example, the emulation module 308 operates three different emulation environments in parallel.
  • the plug-in module 1204 is configured to load one or more plug-ins into the emulation environment to process the object.
  • the plug-ins may include application-layer information (e.g., Adobe, shared drivers, and mutex), network information (e.g., available port), and the like.
  • the plug-in module 1204 does not comprise plug-ins for security or tracking operations which may be detected by the object.
  • the resource module 504 of the virtualization module 306 provides a list of required resources and/or metadata to the plug-in module 1204 .
  • the plug-in module 1204 may provision the emulation environment based in part on the information received from the resource module 504 , the object, and/or metadata associated with the object.
  • the resource module 504 is a hypervisor.
  • the trace capture module 1206 is configured to track an execution path for the object.
  • the trace capture module 1206 may trace actions taken for and by the object in the emulation module 308 .
  • the trace capture module 1206 may be within the hypervisor layer, be a plug-in, or be a combination of both. As a result, the trace capture module 1206 and/or the functions of the trace capture module 1206 may be invisible to the object.
  • the trace capture module 1206 traces the operations of and for the object in the emulation environment.
  • the trace capture module 1206 may generate a trace for the object in the emulation environment based on actions of the plug-ins (e.g., an X trace) and actions taken that correlate between the emulation environment and the plug-ins (e.g., a Y trace).
  • This trace capture process may be similarly taken in the virtualization environment where the X trace may be associated with actions of the plug-ins of the virtualization environment and the Y trace may be associated with actions that correlate between the virtualization environment and the plug-ins.
  • the manager module 1210 may compare the trace for the object in the emulation environment to a trace for the object in the virtualization environment to detect divergence.
  • the trace for the object in the emulation environment and/or virtualization environment may be filtered such that not all actions taken in the emulation environment and/or virtualization environment are necessary to generate the trace. For example, not all of the actions taken by the host system during processing of the object may be relevant to the trace.
  • the trace capture module 1206 may generate a trace based on relevant actions or operations.
  • the trace capture module 1206 filters the actions and operations of the host and/or one or more plug-ins during processing of the object.
  • one or more actions or operations by the host during processing may be compared against a check system to determine if the action or operation is relevant.
  • if the action or operation is relevant, the action or operation may be given weight and may affect the trace. If the action or operation is not relevant, then the action or operation may not be considered when developing the trace. Those skilled in the art will appreciate that similar filtering may occur in determining the trace in the virtualization environment.
  • the recording module 1208 may record operations by or for the object in the emulation environment. In some embodiments, the recording module 1208 records responses to the object within the emulation environment (e.g., responses from the host). The recording module 1208 may also track the state of the emulation environment (e.g., what is stored in memory, where in memory is data stored, and/or time of operation(s)). Those skilled in the art will appreciate that the recording module 1208 may record any kind of information including information from the object, information for the object, or information generated on behalf of the object.
  • the time module 1214 may record time of events, record time states of actions or operations in the emulation environment, and may accelerate time (e.g., the clock signal) within the emulation environment to detect changes in the behavior in the object. For example, some malware is configured to wait a predetermined period of time before acting maliciously. In some embodiments, the time module 1214 may accelerate one or more clock signals in the emulation environment such that the object is given a period of time to trigger untrusted behavior.
  • the virtualization module 306 may re-instantiate the virtualization environment.
  • the state of the emulation environment including the resources, data in memory, locations of data in memory, clock signal, and/or the like may be loaded into the virtualization environment upon re-instantiation.
  • the virtualization environment may begin to process the object at the time (or the time preceding) the divergence between the virtualization environment and the emulation environment.
  • the object may be given at least part of the recorded information from the emulation environment.
  • the recorded information may include all or part of a response to the object at the time divergence was detected.
  • the object may receive the response and act within the virtualization environment as if the object had been received by the target system and a proper response was received; a sketch of this replay appears below. Subsequently, the virtualization module 306 may continue to trace the behavior of the object within the virtualization environment. The new trace may also be compared to the trace of the emulation environment to determine if a divergence is found. If there is no divergence, the virtualization environment may continue to process the suspicious data to look for untrusted behavior.
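  • A minimal sketch of that replay; the queue of recorded responses is a simplification of what the recording module 1208 would capture:

```python
# Hedged sketch: serve the object responses recorded in the emulation
# environment, then fall back to live virtualized handling.
from collections import deque

recorded_responses = deque([b"220 service ready", b"250 ok"])  # from the recorder

def host_response(request: bytes) -> bytes:
    """Replay recorded emulation responses before handling requests live."""
    if recorded_responses:
        return recorded_responses.popleft()  # what the emulation environment saw
    return b"<live response>"                # normal virtualized handling resumes

print(host_response(b"HELO example"))  # replayed: b'220 service ready'
print(host_response(b"MAIL FROM"))     # replayed: b'250 ok'
print(host_response(b"DATA"))          # live from here on
```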
  • the virtualization module 306 and the emulation module 308 may operate the virtualization environment and emulation environment in parallel. For example, after an object is identified as behaving suspiciously, then an emulation environment may be instantiated for processing the object. When a divergence between the virtualization environment and the emulation environment is detected, the virtualization environment may be re-instantiated (e.g., the instance of the virtualization environment may be restarted or a new virtualization environment may be instantiated) with some of the recorded information from the emulation environment. The virtualization environment and emulation environment may continue to process the object in parallel.
  • the emulation environment and/or the virtualization environment may halt and the virtualization environment re-instantiated with new recorded information and/or new state information from the emulation environment. The process may continue until the processing of the object is completed.
  • the emulation environment may be halted or terminated until or unless the object is determined to be behaving in an untrusted manner.
  • the HRE 1216 may be configured to determine maliciousness or provide information that may increase a threat value associated with the object being processed. In some embodiments, the HRE 1216 assesses the behavior of the data being processed. In one example, the HRE 1216 assesses requests for resources that deviate from the initially identified resources, assesses deviations between the virtualization environment and emulation environment, and assesses the significance of a series of actions. If combinations of resource calls and/or actions have been identified as malicious, then the HRE 1216 may flag the data as malicious or increase one or more threat values.
  • sets of resources, sets of requested resources not initially provisioned, and/or sets of actions performed by a suspicious object may be associated with malicious behavior.
  • sets of resources and/or actions may be identified as malicious or as having an increased likelihood of being malicious.
  • the HRE 1216 may review the activities of an object within the virtualization environment and/or emulation environment to determine if a set of requested resources and/or actions is similar to known malicious sets of resources and/or actions. If the HRE 1216 identifies a set of requested resources and/or actions as being malicious, the HRE 1216 may provide threat information that may be heavily weighted in determining the risk of the object (e.g., reduce a value of trustworthiness associated with the object), as in the sketch below.
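  • The HRE's set comparison might be sketched as follows; Jaccard similarity is an assumed measure, since the specification does not name one:

```python
# Hedged sketch: weight an object's threat heavily when its observed resource
# calls and actions resemble a known malicious set.
KNOWN_MALICIOUS = [
    {"scan_memory", "open_socket", "write_executable"},
    {"keyboard_hook", "send_http_post"},
]

def hre_threat_weight(observed: set[str]) -> float:
    """Return 1.0 for a near-match to a known-bad set, else the best similarity."""
    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b)
    best = max(jaccard(observed, bad) for bad in KNOWN_MALICIOUS)
    return 1.0 if best >= 0.75 else best

print(hre_threat_weight({"scan_memory", "open_socket", "write_executable"}))
```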
  • FIG. 13 is a flow diagram of an exemplary malware detection method utilizing an emulation environment in some embodiments.
  • an object is intercepted by a data collector.
  • the data collector, an agent associated with the data collector, and/or a security server 108 may test the object to determine if the object is suspicious. For example, the security server 108 may compare a hash of the object to a whitelist, compare a hash of the object to a blacklist, apply heuristic analysis, or apply statistical analysis. The security server 108 may also apply one or more rules to determine if the object is suspicious.
  • the security server 108 may have rules that flag an object as suspicious if the object came from an untrusted source, was being sent to a critical destination (e.g., the digital device that is to receive the object contains credit card numbers, health records, trade secret information, or other sensitive information), or was otherwise sent in an atypical manner (e.g., the sending digital device does not normally send objects to the destination digital device).
  • the virtual machine module 502 instantiates a first instance of a virtualization environment with the resources identified by the resource module 504 .
  • the virtual machine module 502 sets up plug-ins within the virtualization environment.
  • the resource module 504 inspects what resources the object needs (e.g., dynamic libraries and/or registries the object may affect).
  • the object is executed and/or processed within the virtualization environment.
  • the taint module 508 taints operations of the object within the virtualization environment.
  • the taint module 508 taints the object, bit by bit, with trace capture information.
  • the virtual machine module 502 traces operations of the object during processing within the virtualization environment.
  • operations of, for, or provided as a response to the object may be used to generate one or more traces associated with the object in the virtualization environment.
  • the virtual machine module 502 may generate one or more traces based on actions of the plug-ins (e.g., the X trace) and actions taken that correlate between the virtualization environment and the plug-ins (e.g., the Y trace).
  • the actions identified for the trace generated by the virtual machine module 502 may also be filtered in a manner as described regarding filtering actions associated with the emulation environment.
  • the virtualization module 306 detects suspicious behavior of the object. For example, an object may be flagged as having suspicious behavior if the object executes a number of tasks and then abruptly terminates, executes one or more tasks that appear to have no relevant effect, checks a location in memory which should be empty, scans memory locations for no apparent purpose, hashes communication between the object and the host (e.g., to compare with a predetermined hash to identify a pattern of communication that is consistent with virtualization or security program interference), or the like.
  • the emulation module 308 instantiates an emulation environment.
  • the control module 310 determines when behavior of the object is suspicious and controls the instantiation of the emulation environment.
  • One or more actions for or by the object within the virtualization environment may be used to determine a trustworthiness value.
  • actions taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object.
  • a trustworthiness value associated with one or more actions may be compared to a trustworthiness threshold to determine if the effect of the object within the virtualization environment is sufficiently trustworthy or if the object is behaving sufficiently suspiciously to warrant running the object through the emulation environment.
  • a user (e.g., an administrator or security personnel) may set the trustworthiness threshold.
  • the emulation module 308 instantiates the emulation environment.
  • the plug-in module 1204 instantiates the emulation environment with a standard set of plug-ins and/or other resources.
  • the plug-in module 1204 receives resource information from the resource module 504 , the object, and/or metadata associated with the object.
  • the plug-in module 1204 may configure the emulation environment (e.g., include one or more resources within the emulation environment) based, at least in part, on the resource information.
  • the emulation module 308 processes the object within the emulation environment.
  • the trace capture module 1206 traces operations for or by the object during processing within the emulation environment.
  • the trace of the object within the emulation environment may be based on actions of the plug-ins and/or actions that correlate between the host, plug-ins and/or object.
  • the trace of the object within the emulation environment and the trace of the object within the virtualization environment may be generated in any number of ways.
  • the trace may be generated differently (e.g., based on different actions and/or different filtering) between the emulation environment and the virtualization environment.
  • the recording module 1208 of the emulation module 308 may record information within the emulation environment.
  • the recording module 1208 may record any operation of, resource of, or response within the emulation environment.
  • the recording module 1208 records responses provided from the plug-ins, the host, and/or the hardware to the object.
  • the manager module 1210 may compare one or more traces of the virtualization environment to one or more traces of the emulation environment to detect divergence.
  • the object may scan memory within the virtualization environment and identify files and/or remnants of the running virtualization. As a result, the object may “go benign” and not perform any malicious behavior in the virtualization environment.
  • a copy of the object within the emulation environment may scan the memory and not find any files or remnants of a virtualization and, as a result, execute a malicious payload.
  • the manager module 1210 may identify divergence between the two environments.
  • when divergence is detected, the virtualization environment is re-instantiated or a new virtualization environment is instantiated at the point of divergence.
  • the time of divergence may be identified and the logged state of the emulation environment at that time of divergence may be provisioned within the virtualization environment.
  • the virtualization module 306 may store, at similar memory locations, data similar to the data stored in memory of the emulation environment at the time of divergence.
  • the clock signal within the virtualization environment may be accelerated such that the relevant conditions at the point of divergence may be similar between the two environments.
  • the object within the virtualization environment may be presented with the recorded response from the emulation environment, and the operations of the object may continue to be monitored in the virtualization environment.
  • the virtualization module 306 re-instantiates the virtualization environment with recorded information from the emulation environment. In various embodiments, the virtualization module 306 re-instantiates the virtualization environment by halting processing within the virtualization environment and restarting the virtualization environment. In some embodiments, the virtualization module 306 re-instantiates the virtualization environment by instantiating a new virtualization environment.
  • the newly instantiated virtualization environment may be loaded with one or more states and information from the emulation environment.
  • the emulation module 308 provides a page table (or information which may populate a page table) reflecting state and/or information that reflect operations within the environment.
  • the virtualization module 306 may instantiate the virtualization environment with all or some of the page table information.
  • the virtualization module 306 may also provide all or some of the recorded information from the emulation environment to the object in the newly instantiated virtualization environment.
  • the virtualization module 306 may construct the newly instantiated virtualization environment to include all resources, data, memory, clock signals, and activities up to the point of divergence such that it may appear to the object that the object has been processed from the beginning and the object has received information (e.g., the recorded information) needed to proceed to a next step.
  • the recorded information and state of the newly instantiated virtualization environment allows the object to continue functioning (e.g., the object executes a malicious payload or the object continues examination looking for security programs or virtualization before executing the payload).
  • the virtualization module 306 continues to monitor the operations by and for the object to identify suspicious behavior in the re-instantiated virtualization environment. Similar to step 1310, the virtualization module 306 may detect suspicious behavior in any number of ways. If the object continues to behave suspiciously, the control module 310 may instantiate a new emulation environment and may optionally load resources or plug-ins based on information from the resource module 504. Alternatively, the emulation module 308 may continue monitoring with the existing emulation environment. In some embodiments, the object continues processing in the emulation environment regardless of whether the object behaves suspiciously in the newly re-instantiated virtualization environment.
  • the virtualization module 306 monitors the operations by and for the object to identify untrusted actions in the re-instantiated virtualization environment.
  • one or more actions of or by the object may be characterized as untrusted based on a trustworthiness value.
  • the trustworthiness value may be compared to a trustworthiness threshold to determine if actions taken by or for the object are untrustworthy (e.g., the object is considered to be malware).
  • actions taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object.
  • a user (e.g., an administrator or security personnel) may set the trustworthiness threshold.
  • In step 1326, the reporting module 312 generates a report describing the object and untrusted actions as described herein.
  • the report may be similar to the report generated regarding step 716 in FIG. 7 .
  • FIG. 14 is an exemplary emulation environment 1400 for detection of malware in some embodiments.
  • the emulation environment 1400 may comprise a first domain 1402 running the LINUX® operating system, as well as domains 1404 and 1406 running the WINDOWS® operating system. Further, the emulation environment 1400 may comprise a standard OS 1408, taint analysis & data flow tracking 1410, direct memory access 1412, OS forensics plug-ins 1414, dynamic state modification & data flow captures 1416, a divergence analysis module 1418, a hypervisor component 1420, a malware analysis virtual machine manager 1422, and a processor emulator 1424.
  • the domains 1402 - 1406 are the domains of the emulation environment 1400 .
  • the native domain may be domain 1402 running the LINUX® operating system, while domains 1404 and 1406 emulate the WINDOWS® operating system. Any domain may be native and any domain may be emulated.
  • the standard OS 1408 may be any OS (e.g., the LINUX® operating system) which may pass information to the other components of the emulation environment 1400 .
  • the taint analysis and data flow tracking function 1410 may monitor and perform taint analysis to determine indications of maliciousness. In some embodiments, the taint analysis and data flow tracking function 1410 may receive information regarding tracking functions and tainting from a plug-in (e.g., from the OS forensics plug-ins function 1414 ).
  • the dynamic state modification & data flow captures 1416 may determine what resources are needed by an object in the emulation environment 1400 .
  • the dynamic state modification & data flow captures 1416 may identify additional resources required by the object, increase or decrease time value within the emulation environment 1400 , and/or monitor behavior of the object.
  • the object may request resources that were not originally provisioned by the hypervisor component 1420 .
  • the hypervisor component 1420 may provision a new emulation environment 1400 , adjust the emulation environment 1400 with the new resources, and/or synchronize requested resources with one or more other virtualization environment(s) and/or emulation environment(s).
  • the divergence analysis module 1418 may track and compare operations of an object to determine divergence as discussed herein. In various embodiments, the divergence analysis module 1418 may trace operations as depicted in FIGS. 15 and 16 herein.
  • the hypervisor component 1420 may be configured to synchronize between multiple virtualization environment(s) and emulation environment(s). For example, the hypervisor component 1420 may control initial provisioning within the emulation environment based on resources requested in the virtualization environment, resources originally provisioned within the virtualization environment, resources requested in another emulation environment, or metadata associated with the object to be tested. The hypervisor component 1420 may also provide resources to be provisioned to one or more virtualization environment(s) and compare the operations between or among any number of environments (both virtualization environments and emulation environments).
  • the malware analysis virtual machine manager 1422 may receive information from the direct memory access 1412 and control the OS forensics plug-ins 1414 .
  • the OS forensics plug-ins 1414 may provide information to the dynamic state modification & data flow captures function 1416 , the divergence analysis module 1418 , and/or the hypervisor component 1420 .
  • the malware analysis virtual machine manager 1422 may also select, initiate, control, and/or deactivate plug-ins.
  • the processor emulator 1424 is any processor and/or emulator to assist in the emulation process.
  • the processor emulator 1424 may implement a generic machine emulator and virtualizer.
  • FIG. 15 is a trace diagram 1500 of operations by or for an object in an emulation environment in some embodiments.
  • the trace capture module 1206 generates a trace of the behavior of the object in the emulation environment.
  • the trace of the object in the emulation environment may be compared to a trace of the object in the virtualization environment to detect divergence.
  • the trace of the object in the virtualization environment may be generated based on different factors or in a different way than the trace generated for the object in the emulation environment.
  • actions taken for or by the object in the emulation environment may traverse execution trees.
  • the object may be a file that is executed within the INTERNET EXPLORER® browser.
  • the object may spawn a mail process and an Active X process.
  • nodes and branches generated on the execution tree may vary based on context.
  • the trace capture module 1206 captures execution paths of the execution trees.
  • the trace capture module 1206 may capture time and events (e.g., operations by or for the object) to determine an execution path.
  • the execution path may represent functions of the plug-ins of the emulation environment.
  • the control module 310 may correlate the execution paths of the execution trees between the emulation environment and the virtualization environment to detect divergence. Further, the execution paths of the execution trees of the virtualization environment and the emulation environment may be mapped against malicious behaviors and threat thresholds to determine the degree of untrustworthiness. For example, one or more operations of or by the object in the virtualization environment and/or the emulation environment may be measured against a predetermined threshold (e.g., based on a frame of reference identifying a degree of risk along different dimensions such as object operations, application operations, operating system operations, and network operations).
  • the threshold and trustworthiness valuation procedure may be customized. Certain risks, depending on the nature of the network, the state of critical information (e.g., whether it is encrypted), and the like, may influence how an administrator characterizes the threshold and valuation procedure.
  • FIG. 16 is a block diagram 1600 of divergence detection between a virtualization environment and an emulation environment in some embodiments. Traces may be discrete based on events (e.g., operations) and/or time. Those skilled in the art will appreciate that in the various execution trees depicted in FIG. 16 , the execution tree on the left side of the graphs represents an execution tree in the virtualization environment while the execution tree on the right of the graphs represents an execution tree in the emulation environment. Although the term “iteration” is used within FIG. 16 , the different graphs 1602 - 1610 may be understood to be based on time, events, or paths.
  • the instantiation of the execution tree at time 0 depicts an initial operation.
  • the initial operation of the execution tree may be an instantiation of an application (e.g., INTERNET EXPLORER® browser v. 5.0) or resource call by or for the object.
  • the spot in graph 1602 is identified with a threat value to indicate whether the operation is trustworthy or untrustworthy.
  • one or more executions of the execution tree may be measured (e.g., characterized as a threat level) to determine a degree of threat or maliciousness (e.g., a trustworthiness value).
  • the measure (i.e., threat or trustworthiness value) may be compared against a threat or trustworthiness threshold to determine whether the action represents an untrusted action (e.g., malicious behavior).
  • all executions, paths, and nodes of the execution path may be measured to determine the degree of threat or maliciousness.
  • each individual step (e.g., the deltas between the graphs) may be measured to determine a degree of threat of each step.
  • the entire execution tree at various points in time may also be measured to determine a degree of threat of the steps in combination.
  • the first path (e.g., at time T+1 where time T begins at the 1 st Iteration 1602 ) indicates that the execution path of the object within the virtualization environment and the execution path of the object within the emulation environment are similar.
  • the object may load or make a call to Active X.
  • the second path (e.g., at time T+2) at the nth iteration indicates that the next execution path of the object within the virtualization environment and the execution path of the object within the emulation environment remain similar.
  • the third path (e.g., at time T+3), indicates that there is a divergence of the execution path of the object in the virtualization environment when compared to the execution path of the object in the emulation environment.
  • the object in the virtualization environment may have detected virtualization, a missing resource, or evidence of a security application and started behaving in a different manner.
  • the control module 310 may re-instantiate the virtualization environment.
  • the control module 310 may load the state of the previous virtualization environment or the state of the emulation environment in the newly re-instantiated virtualization environment.
  • the state may include information of the virtualization environment and/or the emulation environment immediately before or at the time of divergence.
  • the original virtualization environment and the emulation environment may continue without termination to further assess the execution path of the object in both environments.
  • the object may continue to operate in a similar manner or to perform slightly different actions.
  • although graph 1610 is identified as “final,” there may be any number of paths over time before termination of the execution path (e.g., by the object, the virtualization module 306, the emulation module 308, or the control module 310).
  • control module 310 may determine divergence of the execution trees between the object in the virtualization environment and the emulation environment as the steps in one or the other environments occur or once processing within one or both environments terminates.
  • the emulation module 308 may begin processing the object at any point in time.
  • the virtualization module 306 may track when suspicious behavior occurred.
  • the emulation module 308 may be configured to provision the emulation environment with at least some of the resources of the virtualization module 306 including the states of the virtualization environment at the time of suspicious behavior.
  • the emulation module 308 may then begin processing the object immediately before or at the time of suspicious behavior.
  • FIG. 17 is an exemplary process 1700 for a hierarchical reasoning engine (HRE) in some embodiments.
  • the HRE extracts significant instructions from a series of actions of an object under assessment.
  • the HRE identifies sets of actions (e.g., resource requests and/or operations) that may be associated with malicious activities.
  • the set of actions may be in any order or may, in part, depend upon the order of instructions.
  • the HRE may identify significant patterns based upon the set of instructions. For example, the HRE may compare sets or subsets of instructions against a table or other data structure that contains sets or subsets of instructions that indicate maliciousness. In some embodiments, the HRE may compare sets or subsets of instructions against a table or other data structure that contains sets or subsets of instructions that indicate trustworthiness.
  • the HRE may calculate a likelihood that a pattern indicative of trust or maliciousness may occur.
  • the HRE may take many different types of information (e.g., statistics, heuristics, metadata, and the like) into account to determine a likelihood.
  • the HRE may determine a value or bias a threat value based on the likelihood that the set of actions is malicious.
  • the HRE may monitor operations of an object within an emulation environment.
  • the HRE may track the actions of the object and compare a set of the object's actions to known malicious sets of actions. If the object's actions match a set of actions that are known to be malicious, then the HRE may flag the object as malicious and update a threat index to indicate an increased likelihood of maliciousness (a sketch of such pattern matching follows this list).
  • objects may continue to be assessed in one or more virtualization environment(s) and/or emulation environment(s) (e.g., all or a subset of environments) for more information.
  • the HRE may provide another level of analysis that identifies the likelihood of maliciousness and further provides better information regarding the object's risk or trustworthiness.
  • a user may set a preference for an acceptable level of risk to accept or reject data based on the level of trustworthiness calculated by systems and methods described herein.
  • the above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium.
  • the instructions can be retrieved and executed by a processor.
  • Some examples of instructions are software, program code, and firmware.
  • Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers.
  • the instructions are operational when executed by the processor to direct the processor to operate in accord with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
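By way of illustration only, the following minimal Python sketch shows one way the weighted action matrix and trustworthiness threshold discussed in the list above might be realized. The action names, weights, and threshold value are hypothetical assumptions for this sketch, not values taken from the specification.

    # Hypothetical sketch: weighting observed actions and comparing the total
    # against a trustworthiness threshold to decide whether emulation is warranted.
    ACTION_WEIGHTS = {                      # illustrative weights only
        "scan_empty_memory": 0.30,          # object-level action
        "abrupt_termination": 0.25,         # application-level action
        "hash_host_traffic": 0.20,          # network-level action
        "load_unused_data": 0.15,           # operating-system-level action
    }
    TRUST_THRESHOLD = 0.5                   # assumed cutoff, settable by an administrator

    def trustworthiness_value(observed_actions):
        """Sum the weights of the observed actions into a suspicion score."""
        return sum(ACTION_WEIGHTS.get(a, 0.0) for a in observed_actions)

    def warrants_emulation(observed_actions):
        """True when behavior is suspicious enough to process the object in emulation."""
        return trustworthiness_value(observed_actions) >= TRUST_THRESHOLD

    print(warrants_emulation(["scan_empty_memory", "abrupt_termination"]))  # True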
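Similarly, the divergence detection described for the manager module 1210 can be pictured as a comparison of two traces. The sketch below assumes simplified (time, operation) trace entries invented for illustration and returns the first index at which the virtualization and emulation traces differ.

    from itertools import zip_longest

    def find_divergence(virt_trace, emu_trace):
        """Return the index of the first differing trace entry, or None if the traces agree."""
        for i, (v, e) in enumerate(zip_longest(virt_trace, emu_trace)):
            if v != e:
                return i
        return None

    virt = [("T+1", "load_activex"), ("T+2", "open_socket"), ("T+3", "go_benign")]
    emu  = [("T+1", "load_activex"), ("T+2", "open_socket"), ("T+3", "drop_payload")]
    print(find_divergence(virt, emu))  # 2: re-instantiate the virtualization environment here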
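Finally, the hierarchical reasoning engine's pattern matching can be sketched as a set-containment test against known-malicious action sets. The pattern entries and the bias applied to the threat index are hypothetical placeholders.

    KNOWN_MALICIOUS_SETS = [                # illustrative patterns only
        frozenset({"open_registry", "disable_av", "connect_c2"}),
        frozenset({"scan_ports", "copy_credentials"}),
    ]

    def hre_assess(actions, threat_index=0.0):
        """Raise the threat index for each known-malicious action set covered by the actions."""
        observed = set(actions)
        for pattern in KNOWN_MALICIOUS_SETS:
            if pattern <= observed:         # order-insensitive match, per the text
                threat_index += 0.5         # assumed bias per matched pattern
        return threat_index

    print(hre_assess(["scan_ports", "write_temp", "copy_credentials"]))  # 0.5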

Abstract

Systems and methods for virtualization and emulation assisted malware detection are described. In some embodiments, a method comprises intercepting an object; instantiating and processing the object in a virtualization environment; tracing operations of the object while processing within the virtualization environment; detecting suspicious behavior associated with the object; instantiating an emulation environment in response to the detected suspicious behavior; processing, recording responses to, and tracing operations of the object within the emulation environment; detecting a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment; re-instantiating the virtualization environment; providing the recorded response from the emulation environment to the object in the virtualization environment; monitoring the operations of the object within the re-instantiation of the virtualization environment; identifying untrusted actions from the monitored operations; and generating a report regarding the identified untrusted actions of the object.

Description

CROSS-REFERENCE
This application is related to and incorporates by reference U.S. nonprovisional application Ser. No. 13/288,917, filed Nov. 3, 2011, and titled “Systems and Methods for Virtualized Malware Detection.”
BACKGROUND
1. Field of the Invention
The present invention(s) generally relate to malware detection. More particularly, the invention(s) relate to systems and methods for virtualization and emulation assisted malware detection.
2. Description of Related Art
Malware and advanced persistent attacks are growing in number as well as damage. In 2010, the rise of targeted attacks included armored variations of Conficker.D and Stuxnet (which was referred to as the most advanced piece of malware ever created). Targeted attacks on organizations such as Google Inc., Intel Corporation, Adobe Systems Inc., The Boeing Company, and an estimated 60 others have been extensively covered in the press. State-of-the-art security defenses have proven ineffective.
Cyber-criminals conduct methodical reconnaissance of potential victims to identify traffic patterns and existing defenses. Very sophisticated attacks involve multiple “agents” that individually appear to be legitimate traffic, then remain persistent in the target's network. The arrival of other agents may also be undetected, but when all are in the target network, these agents can work together to compromise security and steal targeted information. Legacy security solutions use a structured process (e.g., signature and heuristics matching) or analyze agent behavior in an isolated context, without the ability to detect future coordinated activity. As a result, legacy security solutions are not able to detect sophisticated malware that is armored, component based, and/or includes different forms of delayed execution.
SUMMARY OF THE INVENTION
Systems and methods for virtualized malware detection are described. In some embodiments, a method comprises intercepting an object provided from a first digital device to a second digital device, determining one or more resources the object requires when the object is executed, instantiating a virtual environment with the one or more resources, processing the object within the virtual environment, tainting operations of the object within the virtual environment, monitoring the operations of the object while processing within the virtual environment, identifying an additional resource of the object while processing that is not provided in the virtual environment, re-instantiating the virtual environment with the additional resource as well as the one or more resources, monitoring the operations of the object while processing within the re-instantiated virtual environment, identifying untrusted actions from the monitored operations, and generating a report identifying the operations and the untrusted actions of the object.
The object may comprise an executable file, a batch file, or a data file.
The method may further comprise performing a heuristic process on the object and determining the one or more resources the object requires based on the result of the heuristic process. Determining the one or more resources the object requires may be based on metadata associated with the object. The one or more resources may include one or more applications.
Generating the report identifying the operations and the untrusted actions of the object may comprise generating a signature to be used to detect malware. In some embodiments, generating the report identifying the operations and the untrusted actions of the object may comprise identifying a vulnerability in an application based on the operations and the untrusted actions of the object.
Re-instantiating the virtual environment with the additional resource as well as the one or more resources may comprise instantiating a second instance of a virtual environment with at least one resource that is different than a resource available in the prior virtual environment. Further, the method may comprise comparing identified monitored operations of the prior virtual environment to operations monitored in the second instance of the virtual environment. Generating the report may comprise generating the report based, at least in part, on the comparison.
The method may further comprise increasing or decreasing a frequency of a clock signal within the virtual environment. In some embodiments, the method may comprise logging a state of the virtual environment while monitoring the operations of the object. Further, re-instantiating the virtual environment with the additional resource as well as the one or more resources may comprise halting the virtual environment and re-instantiating the virtual environment with the logged state.
An exemplary system may comprise a collection module, a virtualization module, a control module, and a report module. The collection module may be configured to receive an object provided from a first digital device to a second digital device. The virtualization module may be configured to instantiate a virtual environment with the one or more resources, to process the object within the virtual environment, to identify an additional resource of the object while processing that is not provided in the virtual environment, to re-instantiate the virtual environment with the additional resource as well as the one or more resources, and to taint operations of the object within the virtual environment. The control module may be configured to determine one or more resources the object requires when the object is processed, to monitor the operations of the object while processing within the virtual environment, to monitor the operations of the object while processing within the re-instantiated virtual environment, and to identify untrusted actions from the monitored operations. The report module may be configured to generate a report identifying the operations and the untrusted actions of the object.
An exemplary computer readable medium may comprise instructions. The instructions may be executable by a processor for performing a method. The method may comprise intercepting an object provided from a first digital device to a second digital device, determining one or more resources the object requires when the object is executed, instantiating a virtual environment with the one or more resources, processing the object within the virtual environment, tainting operations of the object within the virtual environment, monitoring the operations of the object while processing within the virtual environment, identifying an additional resource of the object while processing that is not provided in the virtual environment, re-instantiating the virtual environment with the additional resource as well as the one or more resources, monitoring the operations of the object while processing within the re-instantiated virtual environment, identifying untrusted actions from the monitored operations, and generating a report identifying the operations and the untrusted actions of the object.
Systems and methods for virtualization and emulation malware detection are described. In some embodiments, a method comprises intercepting an object provided from a first digital device to a second digital device, instantiating a virtualization environment with the one or more resources, processing the object within the virtualization environment, tracing operations of the object while processing within the virtualization environment, detecting suspicious behavior associated with the object in the virtualization environment, instantiating an emulation environment in response to the detected suspicious behavior, processing the object within the emulation environment, recording responses to the object within the emulation environment, tracing operations of the object while processing within the emulation environment, detecting a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, re-instantiating the virtualization environment in response to the detected divergence, providing the recorded response from the emulation environment to the object in the re-instantiated virtualization environment, monitoring the operations of the object while processing within the re-instantiation of the virtualization environment, identifying untrusted actions from the monitored operations, and generating a report regarding the identified untrusted actions of the object.
In various embodiments, the suspicious behavior comprises the object loading data into memory within the virtualization environment but not utilizing the data, the object scanning locations in memory of the virtualization environment and then terminating operations, or the object abruptly halting operations.
Trace capturing may be performed in a kernel of a digital device hosting the emulation environment. The method may further comprise increasing or decreasing a frequency of a clock signal within the emulation environment.
Re-instantiating the virtualization environment in response to the detected divergence may comprise instantiating a modified image of the virtualization environment. Re-instantiating the virtualization environment in response to the detected divergence may comprise halting the virtualization environment and restarting the virtualization environment.
In some embodiments, the method may further comprise applying state information from the emulation environment to the re-instantiated virtualization environment. The virtualization environment may be re-instantiated at a point in time where divergence is detected between the virtualization environment and the emulation environment.
An exemplary system may comprise a collection module, a virtualization module, an emulation module, and a control module. The collection module may be configured to receive an object provided from a first digital device to a second digital device. The virtualization module may be configured to instantiate a virtualization environment with the one or more resources, to process the object within the virtualization environment, to trace operations of the object while processing within the virtualization environment, to detect suspicious behavior associated with the object in the virtualization environment, to monitor the operations of the object while processing within a re-instantiation of the virtualization environment, to identify untrusted actions from the monitored operations, and to generate a report regarding the identified untrusted actions of the object. The emulation module may be configured to instantiate an emulation environment in response to the detected suspicious behavior, to process the object within the emulation environment, to record responses to the object within the emulation environment and to trace operations of the object while processing within the emulation environment. The control module may be configured to detect a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, to re-instantiate the virtualization environment in response to the detected divergence, and to provide the recorded response from the emulation environment to the object in the virtualization environment.
An exemplary computer readable medium may comprise instructions. The instructions may be executable by a processor for performing a method. The method may comprise intercepting an object provided from a first digital device to a second digital device, instantiating a virtualization environment with the one or more resources, processing the object within the virtualization environment, tracing operations of the object while processing within the virtualization environment, detecting suspicious behavior associated with the object in the virtualization environment, instantiating an emulation environment in response to the detected suspicious behavior, processing the object within the emulation environment, recording responses to the object within the emulation environment, tracing operations of the object while processing within the emulation environment, detecting a divergence between the traced operations of the object within the virtualization environment and the traced operations of the object within the emulation environment, re-instantiating the virtualization environment in response to the detected divergence, providing the recorded response from the emulation environment to the object in the virtualization environment, monitoring the operations of the object while processing within the re-instantiation of the virtualization environment, identifying untrusted actions from the monitored operations, and generating a report regarding the identified untrusted actions of the object.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram of an environment in which some embodiments may be practiced.
FIG. 2 is a flow diagram of an exemplary process for detection of malware and subsequent reporting in some embodiments.
FIG. 3 is a block diagram of an exemplary security server in some embodiments.
FIG. 4 is a conceptual block diagram of a virtualization module in some embodiments.
FIG. 5 is a block diagram of an exemplary virtualization module in some embodiments.
FIG. 6 is an exemplary virtualization environment for detection of malware in some embodiments.
FIG. 7 is a flow diagram of an exemplary malware detection method.
FIG. 8 is a flow diagram of an exemplary method of controlling a virtualization environment to detect malware.
FIG. 9 is a flow diagram of an exemplary model to detect malware through multiple virtualization environments.
FIG. 10 is a block diagram of an exemplary digital device.
FIG. 11 is a conceptual block diagram of an emulation environment in some embodiments.
FIG. 12 is a block diagram of an exemplary emulation module in some embodiments.
FIG. 13 is a flow diagram of an exemplary malware detection method utilizing an emulation environment in some embodiments.
FIG. 14 is an exemplary emulation environment for detection of malware in some embodiments.
FIG. 15 is a trace diagram of operations by or for an object in an emulation environment in some embodiments.
FIG. 16 is a block diagram of divergence detection between a virtualization environment and an emulation environment in some embodiments.
FIG. 17 is an exemplary process for a hierarchical reasoning engine (HRE) in some embodiments.
DETAILED DESCRIPTION OF THE INVENTION
Some embodiments of systems and methods described herein describe appliance-based solutions to protect enterprises, governments, and cloud infrastructures against targeted sophisticated attacks with corporate espionage or possibly cyber warfare objectives. By watching patterns of abnormal traffic, various systems and methods described herein may predict interactions, identify vulnerabilities, and predictably deny particular protocols, data, or network paths to developing malware.
An exemplary system comprises a heuristics engine, an instrumented execution infrastructure, and an intelligent engine. The heuristics engine may identify payloads that require further static and dynamic analysis. The dynamic and instrumented execution infrastructure may combine both virtualization and emulation environments. The environments may be constantly updated dynamically to enable “suspect” traffic to execute to its fullest extent through divergence detection and distributed interaction correlation. The intelligent engine may exchange and cross-reference data between “on the fly” spawned virtual environments and emulated environments allowing, for example, the implementation of such resources as modified nested page tables. As a result, the virtualization environment may recreate all or part of the end-user environment as well as a fully optimized environment to extract the full execution and behavior of potential malware. A contextual environment may also be created to allow analysis of targeted malware built with armoring capabilities such as anti-virtualization, or anti-debugging technologies.
FIG. 1 is a diagram of an environment 100 in which some embodiments may be practiced. Systems and methods embodied in the environment 100 may detect malicious activity, identify malware, identify exploits, take preventive action, generate signatures, generate reports, determine malicious behavior, determine targeted information, recommend steps to prevent attack, and/or provide recommendations to improve security. The environment 100 comprises a data center network 102 and a production network 104 that communicate over a communication network 106. The data center network 102 comprises a security server 108. The production network 104 comprises a plurality of end user devices 110. The security server 108 and the end user devices 110 may comprise digital devices. A digital device is any device with a processor and memory. An embodiment of a digital device is depicted in FIG. 10.
The security server 108 is a digital device configured to identify malware and/or suspicious behavior by running virtualized and emulated environments and monitoring behavior of suspicious data within the virtualized and emulated environments. In various embodiments, the security server 108 receives suspicious data from one or more data collectors. The data collectors may be resident within or in communication with network devices such as Intrusion Prevention System (IPS) collectors 112 a and 112 b, firewalls 114 a and 114 b, Internet content adaptation protocol/web cache communication protocol (ICAP/WCCP) collectors 116, milter mail plug-in collectors 118, switch collectors 120, and/or access points 124. Those skilled in the art will appreciate that a collector and a network device may be two separate digital devices (e.g., see firewall collector and intrusion detection system collector).
In various embodiments, data collectors may be at one or more points within the communication network 106. A data collector, which may include a test access point (TAP) or switch port analyzer (SPAN) port (e.g., SPAN port/IDS at switch 120) for example, is configured to intercept network data from a network. The data collector may be configured to identify suspicious data. Suspicious data is any data collected by the data collector that has been flagged as suspicious by the data collector and/or any data that is to be processed within the virtualization environment.
The data collectors may filter the data before flagging the data as suspicious and/or providing the collected data to the security server 108. For example, the data collectors may filter out plain text but collect executables or batches. Further, in various embodiments, the data collectors may perform intelligent collecting. For example, data may be hashed and compared to a whitelist. The whitelist may identify data that is safe. In one example, the whitelist may identify digitally signed data or data received from a known trusted source as safe. Further, the whitelist may identify previously received information that has been determined to be safe. If data has been previously received, tested within the environments, and determined to be sufficiently trustworthy, the data collector may allow the data to continue through the network. Those skilled in the art will appreciate that the data collectors (or agents associated with the data collectors) may be updated by the security server 108 to help the data collectors recognize sufficiently trustworthy data and to take corrective action (e.g., quarantine and alert an administrator) if untrustworthy data is recognized. In some embodiments, if data is not identified as safe, the data collectors may flag the data as suspicious for further assessment.
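As a rough illustration of the hashing and whitelist comparison just described, the Python sketch below flags any payload whose SHA-256 digest is not on a whitelist; the digest shown is a placeholder for illustration (it is the digest of the bytes b"test"), not real whitelist content.

    import hashlib

    WHITELIST = {
        # placeholder SHA-256 digest of previously tested, trustworthy data
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def collect(payload: bytes) -> str:
        """Pass whitelisted data through; flag everything else for the security server."""
        digest = hashlib.sha256(payload).hexdigest()
        return "pass" if digest in WHITELIST else "flag_suspicious"

    print(collect(b"test"))  # "pass"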
Those skilled in the art will appreciate that one or more agents or other modules may monitor network traffic for common behaviors and may configure a data collector to collect data when data is directed in a manner that falls outside normal parameters. For example, the agent may determine or be configured to appreciate that a computer has been deactivated, a particular computer does not typically receive any data, or data received by a particular computer typically comes from a limited number of sources. If data is directed to a digital device in a manner that is not typical, the data collector may flag such data as suspicious and provide the suspicious data to the security server 108.
Network devices include any device configured to receive and provide data over a network. Examples of network devices include, but are not limited to, routers, bridges, security appliances, firewalls, web servers, mail servers, wireless access points (e.g., hotspots), and switches. In some embodiments, network devices include IPS collectors 112 a and 112 b, firewalls 114 a and 114 b, ICAP/WCCP servers 116, devices including milter mail plug-ins 118, switches 120, and/or access points 124.
The IPS collectors 112 a and 112 b may include any anti-malware device including IPS systems, intrusion detection and prevention systems (IDPS), or any other kind of network security appliances.
The firewalls 114 a and 114 b may include software and/or hardware firewalls. In some embodiments, the firewalls 114 a and 114 b may be embodied within routers, access points, servers (e.g., web servers), or appliances.
ICAP/WCCP servers 116 include any web server or web proxy server configured to allow access to a network and/or the Internet. Network devices including milter mail plug-ins 118 may include any mail server or device that provides mail and/or filtering functions and may include digital devices that implement milter, mail transfer agents (MTAs), sendmail, and postfix, for example.
Switches 120 include any switch or router. In some examples, the data collector may be implemented as a TAP, SPAN port, and/or intrusion detection system (IDS). Access points 124 include any device configured to provide wireless connectivity with one or more other digital devices.
The production network 104 is any network that allows one or more end user devices 110 to communicate over the communication network 106. The communication network 106 is any network that may carry data (encoded, compressed, and/or otherwise) from one digital device to another. In some examples, the communication network 106 may comprise a LAN and/or WAN. Further, the communication network 106 may comprise any number of networks. In some embodiments, the communication network 106 is the Internet.
FIG. 1 is exemplary and does not limit systems and methods described herein to the use of only those technologies depicted. For example, data collectors may be implemented in any web or web proxy server and are not limited to only the servers that implement ICAP and/or WCCP. Similarly, collectors may be implemented in any mail server and are not limited to mail servers that implement milter. Data collectors may be implemented at any point in one or more networks.
Those skilled in the art will appreciate that although FIG. 1 depicts a limited number of digital devices, collectors, routers, access points, and firewalls, there may be any kind and number of devices. For example, there may be any number of security servers 108, end user devices 110, IPS collectors 112 a and 112 b, firewalls 114 a and 114 b, ICAP/WCCP collectors 116, milter mail plug-ins 118, switches 120, and/or access points 124. Further, there may be any number of data center networks 102 and/or production networks 104.
FIG. 2 is a flow diagram of an exemplary process 200 for detection of malware and subsequent reporting in some embodiments. In step 202, suspect traffic is identified. In various embodiments, any network device may be used to monitor and/or collect network traffic for further assessment. In various embodiments, the network device and/or another digital device (e.g., the security server 108) applies heuristics and/or rules (e.g., comparison of data to a whitelist and/or a blacklist) to identify suspicious data. Those skilled in the art will appreciate that any technique may be used to flag network traffic as suspicious. For example, the security server 108 may flag data as suspicious if the data is directed towards a known infected computer, a disabled account, or any untrustworthy destination. Further, for example, the security server 108 may flag data as suspicious if the data came from a suspected source of malware or a source that is known to be untrustworthy (e.g., a previously identified botnet server). In another example, the data collector and/or agent associated with the data collector may perform packet analysis to identify suspicious characteristics in the collected data including the header, footer, destination IP, origin IP, payload and the like.
In step 204, suspect data and/or suspect processes are tested in one or more virtualization environments for “out of context” behavior analysis of the suspicious data and suspect processes. In some embodiments, the suspect data and/or processes are initially virtualized in a set of virtualization environments. Each different virtualization environment may be provisioned differently (e.g., each different virtualization environment may comprise different resources). The initial set of resources for a virtualization environment may be predetermined based on common resources required for processing the data and/or metadata associated with the data. If the suspect data and/or suspect process are determined to be behaving suspiciously in the virtualization environment, the suspect data and/or process may also be processed in an emulation environment as discussed here.
In various embodiments, the suspect data and/or process is analyzed with multiple virtualization environments to extend predictive analysis to distributed and application interactions as described further herein. The suspect data and/or process may be identified as malware or may behave in an untrusted manner in the virtualized environment. In order to further assess the data and/or process, the data and/or process may be processed in a plurality of different virtualization environments with different resources and different limitations. Those skilled in the art will appreciate that the suspicious data and/or process may or may not be further tested after the initial set of environments.
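To make the idea of processing the same object across differently provisioned virtualization environments concrete, here is a minimal sketch; the resource profiles and the callable that actually instantiates an environment are assumptions standing in for the real sandbox layer.

    RESOURCE_SETS = [                        # hypothetical provisioning profiles
        {"os": "winxp", "browser": "ie6"},
        {"os": "win7", "browser": "ie7"},
        {"os": "win7", "browser": "ie8"},
    ]

    def assess_in_all(obj, run_in_environment):
        """run_in_environment(obj, resources) -> trace, supplied by the sandbox layer.

        Returns the trace observed under each resource profile so behavior
        can be compared across environments for points of divergence."""
        return [(resources, run_in_environment(obj, resources))
                for resources in RESOURCE_SETS]

    # toy stand-in for the sandbox layer
    print(assess_in_all("object.exe", lambda obj, r: ["instantiate", r["browser"]]))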
In step 206, contextual behavioral analysis is conducted on the suspect data and suspect processes using one or more emulation environments. In some embodiments, if the suspicious data acts suspiciously in one or more virtualization environments (e.g., halting execution without performing functions, storing data without using the data, and the like), the data is processed in one or more emulation environments. The emulation environment may be provisioned based on commonly needed resources, metadata associated with the suspicious data, and/or resources identified as needed during processing of the suspicious data within the virtualization environment. The suspicious data may have direct access to memory data in the emulation environment. The behavior of the suspicious data may be monitored within the emulation environment.
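The provisioning decision described above can be summarized as a union of resource sets, as in this sketch; the resource names are illustrative only.

    COMMON_RESOURCES = {"standard_os", "network_stack"}

    def provision_emulation(metadata_resources, observed_resources):
        """Combine commonly needed resources, resources hinted by the object's metadata,
        and resources the object was seen to need during virtualization."""
        return COMMON_RESOURCES | set(metadata_resources) | set(observed_resources)

    print(provision_emulation({"word_processor"}, {"activex"}))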
In step 208, exploits are identified and validated based on the behavior of the suspect data or suspect process in the environments. For example, the virtualization and/or emulation environments may be provisioned with various applications and operating systems in order to monitor the behavior of the suspect data or suspect process. As a result, the environments may test suspect data or suspect processes against network resources and/or applications to determine vulnerabilities and malicious actions. As a result, the assessment of the suspect data and/or process may extend predictive analysis to applications for a fuller or complete identification of targeted vulnerabilities.
In some embodiments, when a divergence is detected between the behavior of suspect data and/or process in the virtualization environment and the emulation environment, the virtualization environment may be dynamically re-instantiated and re-provisioned (e.g., the process returns to step 204 with the re-instantiated and/or re-provisioned virtualization environment(s)). Data from the emulation environment (e.g., responses from within the emulation environment) may be injected into the re-provisioned virtualization environment at or close to the time of divergence to enable further execution of the suspect data and assessment of related data.
In step 210, a report is generated that may identify threats and vulnerabilities based on the monitored behaviors of the suspect data and the suspect processes within the testing environments. In various embodiments, the report may include a description of exploits, vulnerabilities of applications or operating systems, behaviors of the suspect data, payloads associated with the suspect data, command and control protocols, and probable targets of the suspect data (e.g., what valuable information the suspicious data was attempting to steal). Further, the report may include heuristics, additions to whitelists, additions to blacklists, statistics, or signatures designed to detect the suspect data.
In various embodiments, the exemplary process 200 may be used to detect distributed attacks characteristic of advanced persistent threats. One exemplary scenario of a distributed attack is that an attacker may send a package to be stored in a specific location in the target computer. The package and the act of storing the package may be benign. The attacker may, over time, subsequently send an attack program. Without the previously stored package, the attack program may also appear benign and may not be detectable as malware by preexisting security solutions. Once the attack program retrieves the previously stored package, however, the attack program may attack the target system (e.g., exploit a vulnerability in the operating system to take over the target computer or copy valuable data).
In various embodiments, the security server 108 may first receive and test a package in at least one of the different environments. A report or other characteristic of the storage (e.g., the location of the stored data and the stored data) may be logged and stored for later testing within the environments. For example, an object that stores a package in memory but does not refer to the package after storage may be deemed to be suspicious. As such, the object may be tested in a variety of different environments and/or the package may be stored (e.g., in a protected long term storage memory such as a hard drive). When the security server 108 subsequently receives the attack program and, during testing, notes that the attack program is suspiciously checking a particular location in memory for data, the security server 108 may recognize that the previously stored package was stored in that particular location of memory. The security server 108 may retrieve the previously received package and store the package within the location in memory in one of the environments and retest the attack program. If the attack program acts maliciously after receiving the package, the security server 108 may generate a report (e.g., information, signature file, heuristic, and/or the like) to identify the package as well as the attack program in order to protect against similar attacks. Moreover, the security server 108 may generate a report identifying the exploited vulnerability so that the vulnerability may be corrected (e.g., the operating system patched or upgraded to correct the exploit). The security server 108 may also generate a report identifying the targeted information (e.g., a password file or file of credit card numbers) so that corrective action may be taken (e.g., move the file or encrypt the information).
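A minimal bookkeeping sketch of this retest scenario might look as follows; the memory-location keys and the hook names are hypothetical, standing in for the security server's actual instrumentation.

    stored_packages = {}                     # memory location -> package bytes

    def record_store(location, package):
        """Remember a package that a suspicious object stored but never used."""
        stored_packages[location] = package

    def on_memory_probe(location):
        """When a later object checks a location, re-supply the earlier package
        inside the test environment so the combined attack can be observed."""
        return stored_packages.get(location)

    record_store(0x4000, b"stage-one payload")
    print(on_memory_probe(0x4000))  # inject the earlier package and retest the attack program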
FIG. 3 is a block diagram of an exemplary security server 108 in some embodiments. In various embodiments, the security server 108 leverages both virtualization and emulation systems and methods to detect malware anti-virtualization protections and accelerate “on-demand” virtualized environments for faster prediction. The security server 108 comprises a collection module 302, a data flagging module 304, a virtualization module 306, an emulation module 308, a control module 310, a reporting module 312, a signature module 314, and a quarantine module 316.
The collection module 302 is configured to receive network data (e.g., potentially suspicious data) from one or more sources. Network data is data that is provided on a network from one digital device to another. The collection module 302 may flag the network data as suspicious data based on, for example, whitelists, blacklists, heuristic analysis, statistical analysis, rules, and/or atypical behavior. In some embodiments, the sources comprise data collectors configured to receive network data. For example, firewalls, IPS, servers, routers, switches, access points and the like may, either individually or collectively, function as or include a data collector. The data collector may forward network data to the collection module 302.
In some embodiments, the data collectors filter the data before providing the data to the collection module 302. For example, the data collector may be configured to collect or intercept data that includes executables and batch files. In some embodiments, the data collector may be configured to follow configured rules. For example, if data is directed between two known and trustworthy sources (e.g., the data is communicated between two devices on a whitelist), the data collector may not collect the data. In various embodiments, a rule may be configured to intercept a class of data (e.g., all documents created with MICROSOFT® Word software that may include macros or data that may comprise a script). In some embodiments, rules may be configured to target a class of attack or payload based on the type of malware attacks on the target network in the past. In some embodiments, the security server 108 may make recommendations (e.g., via the reporting module 312) and/or configure rules for the collection module 302 and/or the data collectors. Those skilled in the art will appreciate that the data collectors may comprise any number of rules regarding when data is collected or what data is collected.
In some embodiments, the data collectors located at various positions in the network may not perform any assessment or determination regarding whether the collected data is suspicious or trustworthy. For example, the data collector may collect all or a portion of the network data and provide the collected network data to the collection module 302 which may perform filtering.
The data flagging module 304 may perform one or more assessments to the collected data received by the collection module 302 and/or the data collector to determine if the intercepted network data is suspicious. The data flagging module 304 may apply rules as discussed herein to determine if the collected data should be flagged as suspicious. In various embodiments, the data flagging module 304 may hash the data and/or compare the data to a whitelist to identify the data as acceptable. If the data is not associated with the whitelist, the data flagging module 304 may flag the data as suspicious.
In various embodiments, collected network data may be initially identified as suspicious until determined otherwise (e.g., associated with a whitelist) or heuristics find no reason that the network data should be flagged as suspicious. In some embodiments, the data flagging module 304 may perform packet analysis to look for suspicious characteristics in the header, footer, destination IP, origin IP, payload, and the like. Those skilled in the art will appreciate that the data flagging module 304 may perform a heuristic analysis, a statistical analysis, and/or signature identification (e.g., signature-based detection involves searching for known patterns of suspicious data within the collected data's code) to determine if the collected network data is suspicious.
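As a toy example of the signature identification mentioned above, a byte-pattern search might be sketched like this; the patterns are invented for illustration, not real malware signatures.

    KNOWN_SIGNATURES = [b"EVIL_MACRO", b"cmd.exe /c del"]

    def matches_signature(data: bytes) -> bool:
        """True when the collected data contains a known suspicious byte pattern."""
        return any(sig in data for sig in KNOWN_SIGNATURES)

    print(matches_signature(b"...cmd.exe /c del C:\\*..."))  # True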
The data flagging module 304 may be resident at the data collector, at the security server 108, partially at the data collector, partially at the security server 108, or on a network device. For example, a router may comprise a data collector and a data flagging module 304 configured to perform one or more heuristic assessments on the collected network data. If the collected network data is determined to be suspicious, the router may direct the collected data to the security server 108.
In various embodiments, the data flagging module 304 may be updated. In one example, the security server 108 may provide new entries for a whitelist, entries for a blacklist, heuristic algorithms, statistical algorithms, updated rules, and/or new signatures to assist the data flagging module 304 to determine if network data is suspicious. The whitelists, entries for whitelists, blacklists, entries for blacklists, heuristic algorithms, statistical algorithms, and/or new signatures may be generated by one or more security servers 108 (e.g., via the reporting module 312).
The virtualization module 306 and emulation module 308 may analyze suspicious data for untrusted behavior (e.g., malware or distributed attacks). The virtualization module 306 is configured to instantiate one or more virtualized environments to process and monitor suspicious data. Within the virtualization environment, the suspicious data may operate as if within a target digital device. The virtualization module 306 may monitor the operations of the suspicious data within the virtualization environment to determine that the suspicious data is probably trustworthy, malware, or requiring further action (e.g., further monitoring in one or more other virtualization environments and/or monitoring within one or more emulation environments). In various embodiments, the virtualization module 306 monitors modifications to a system, checks outbound calls, and checks tainted data interactions.
In some embodiments, the virtualization module 306 may determine that suspicious data is malware but continue to process the suspicious data to generate a full picture of the malware, identify the vector of attack, determine the type, extent, and scope of the malware's payload, determine the target of the attack, and detect whether the malware is designed to work with any other malware. In this way, the security server 108 may extend predictive analysis to actual applications for complete validation. The reporting module 312 may generate a report describing the malware, identify vulnerabilities, generate or update signatures for the malware, generate or update heuristics or statistics for malware detection, and/or generate a report identifying the targeted information (e.g., credit card numbers, passwords, or personal information).
In some embodiments, the virtualization module 306 may flag suspicious data as requiring further emulation and analytics in the back end if the data has suspicious behavior such as, but not limited to, preparing an executable that is not executed, performing functions without result, processing that suddenly terminates, loading data into memory that is not accessed or otherwise executed, scanning ports, or checking specific portions of memory when those locations may be empty. The virtualization module 306 may monitor the operations performed by or for the suspicious data and perform a variety of checks to determine if the suspicious data is behaving in a suspicious manner.
The emulation module 308 is configured to process suspicious data in an emulated environment. Those skilled in the art will appreciate that malware may require resources that are not available or may detect a virtualized environment. When malware requires unavailable resources, the malware may “go benign” or act in a non-harmful manner. In another example, malware may detect a virtualized environment by scanning for specific files and/or memory necessary for a hypervisor, kernel, or other virtualization software to execute. If malware scans portions of its environment and determines that a virtualization environment may be running, the malware may “go benign” and either terminate or perform nonthreatening functions.
In some embodiments, the emulation module 308 processes data flagged as behaving suspiciously by the virtualization environment. The emulation module 308 may process the suspicious data in a bare metal environment where the suspicious data may have direct memory access. The behavior of the suspicious data as well as the behavior of the emulation environment may be monitored and/or logged to track the suspicious data's operations. For example, the emulation module 308 may track what resources (e.g., applications and/or operating system files) are called in processing the suspicious data.
In various embodiments, the emulation module 308 records responses to the suspicious data in the emulation environment. If a divergence in the operations of the suspicious data between the virtualization environment and the emulation environment is detected, the virtualization environment may be configured to inject the response from the emulation environment. The suspicious data may receive the expected response within the virtualization environment and continue to operate as if the suspicious data was within the targeted digital device. This process is further described herein.
The control module 310 synchronizes the virtualization module 306 and the emulation module 308. In some embodiments, the control module 310 synchronizes the virtualization and emulation environments. For example, the control module 310 may direct the virtualization module 306 to instantiate a plurality of different virtualization environments with different resources. The control module 310 may compare the operations of different virtualization environments to each other in order to track points of divergence. For example, the control module 310 may identify suspicious data as operating in one manner when the virtualization environment includes INTERNET EXPLORER® browser v. 7.0 or v. 8.0, but operating in a different manner when interacting with INTERNET EXPLORER® browser v. 6.0 (e.g., when the suspicious data exploits a vulnerability that may be present in one version of an application but not present in another version).
The control module 310 may track operations in one or more virtualization environments and one or more emulation environments. For example, the control module 310 may identify when the suspicious data behaves differently in a virtualization environment in comparison with an emulation environment. In divergence and correlation analysis, operations performed by or for suspicious data in one virtual environment are compared to operations performed by or for the same suspicious data in a different virtual environment or an emulation environment. For example, the control module 310 may compare monitored steps of suspicious data in a virtual environment to monitored steps of the same suspicious data in an emulation environment. The functions or steps of or for the suspicious data may be similar but suddenly diverge. In one example, the suspicious data may not have detected evidence of a virtual environment in the emulation environment and, unlike in the virtualized environment where the suspicious data went benign, the suspicious data undertakes actions characteristic of malware (e.g., hijacks formerly trusted data or a process).
When divergence is detected, the control module 310 may re-provision or instantiate a virtualization environment with information from the emulation environment (e.g., a page table including state information and/or response information further described herein) that may not have been present in the original instantiation of the virtualization environment. The suspicious data may then be monitored in the new virtualization environment to further detect suspicious behavior or untrusted behavior. Those skilled in the art will appreciate that suspicious behavior of an object is behavior that may be untrusted or malicious. Untrusted behavior is behavior that indicates a significant threat.
In some embodiments, the control module 310 is configured to compare the operations of each virtualized environment in order to identify suspicious or untrusted behavior. For example, if the suspicious data takes different operations depending on the version of a browser or other specific resource when compared to other virtualized environments, the control module 310 may identify the suspicious data as malware. Once the control module 310 identifies the suspicious data as malware or otherwise untrusted, the control module 310 may continue to monitor the virtualized environment to determine the vector of attack of the malware, the payload of the malware, and the target (e.g., control of the digital device, password access, credit card information access, and/or ability to install a bot, keylogger, and/or rootkit). For example, the operations performed by and/or for the suspicious data may be monitored in order to further identify the malware, determine untrusted acts, and log the effect or probable effect.
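By way of illustration only, the following Python sketch models comparing operation logs from two differently provisioned virtualization environments to locate the point of divergence; the operation names and log structures are hypothetical and are not part of the described embodiments.

# Hypothetical sketch of locating divergence between operation logs
# from differently provisioned virtualization environments.
def first_divergence(ops_a: list[str], ops_b: list[str]):
    """Return the index of the first differing operation, or None."""
    for i, (a, b) in enumerate(zip(ops_a, ops_b)):
        if a != b:
            return i
    if len(ops_a) != len(ops_b):
        return min(len(ops_a), len(ops_b))
    return None

# The same object processed with two browser versions (invented logs).
env_v6 = ["open_file", "alloc_memory", "write_executable", "open_socket"]
env_v7 = ["open_file", "alloc_memory", "terminate"]

idx = first_divergence(env_v6, env_v7)
if idx is not None:
    print(f"Divergence at step {idx}: {env_v6[idx]!r} vs {env_v7[idx]!r}")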
The reporting module 312 is configured to generate reports based on the processing of the suspicious data of the virtualization module 306 and/or the emulation module 308. In various embodiments, the reporting module 312 generates a report to identify malware, one or more vectors of attack, one or more payloads, target of valuable data, vulnerabilities, command and control protocols, and/or behaviors that are characteristics of the malware. The reporting module 312 may also make recommendations to safeguard information based on the attack (e.g., move credit card information to a different digital device, require additional security such as VPN access only, or the like).
In some embodiments, the reporting module 312 generates malware information that may be used to identify malware or suspicious behavior. For example, the reporting module 312 may generate malware information based on the monitored information of the virtualization environment. The malware information may include a hash of the suspicious data or a characteristic of the operations of or for the suspicious data. In one example, the malware information may identify a class of suspicious behavior as being one or more steps being performed by or for suspicious data at specific times. As a result, suspicious data and/or malware may be identified based on the malware information without virtualizing or emulating an entire attack.
The optional signature module 314 is configured to store signature files that may be used to identify malware. The signature files may be generated by the reporting module 312 and/or the signature module 314. In various embodiments, the security server 108 may generate signatures, malware information, whitelist entries, and/or blacklist entries to share with other security servers. As a result, the signature module 314 may include signatures generated by other security servers or other digital devices. Those skilled in the art will appreciate that the signature module 314 may include signatures generated from a variety of different sources including, but not limited to, other security firms, antivirus companies, and/or other third-parties.
In various embodiments, the signature module 314 may provide signatures which are used to determine if network data is suspicious or is malware. For example, if network data matches the signature of known malware, then the network data may be classified as malware. If network data matches a signature that is suspicious, then the network data may be flagged as suspicious data. The malware and/or the suspicious data may be processed within a virtualization environment and/or the emulation environment as discussed herein.
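By way of illustration only, the following Python sketch models signature-based classification of network data; the signature patterns and family names are invented for the example and are not part of the described embodiments.

# Hypothetical sketch of signature matching against network data;
# the byte patterns shown are invented for illustration.
MALWARE_SIGNATURES = {b"\x4d\x5a\x90\x00evil": "known-malware-family-A"}
SUSPICIOUS_SIGNATURES = {b"cmd.exe /c ": "suspicious-shell-invocation"}

def classify(network_data: bytes) -> str:
    """Classify data as malware, suspicious, or unclassified."""
    for pattern, name in MALWARE_SIGNATURES.items():
        if pattern in network_data:
            return f"malware ({name})"
    for pattern, name in SUSPICIOUS_SIGNATURES.items():
        if pattern in network_data:
            return f"suspicious ({name})"
    return "unclassified"

print(classify(b"GET / HTTP/1.1 cmd.exe /c whoami"))  # suspicious (...)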
The quarantine module 316 is configured to quarantine suspicious data and/or network data. In various embodiments, when the security server 108 identifies malware or probable malware, the quarantine module 316 may quarantine the suspicious data, network data, and/or any data associated with the suspicious data and/or network data. For example, the quarantine module 316 may quarantine all data from a particular digital device that has been identified as being infected or possibly infected.
In some embodiments, the quarantine module 316 is configured to alert a security administrator or the like (e.g., via email, call, voicemail, or SMS text message) when malware or possible malware has been found.
In various embodiments, the security server 108 allows an administrator or other personnel to log into the security server 108. In one example, the security server 108 provides a graphical user interface or other user interface that authenticates a user (e.g., via digital signature, password, username, and the like). After the user is authenticated, the security server 108 may allow the user to view the processing of the virtualization module 306 and the emulation module 308, including infection vectors and vulnerability vectors. The security server 108 may also provide the user with threshold reasoning, which is further described with regard to FIG. 4.
FIG. 4 is a conceptual block diagram 400 of a virtualization module in some embodiments. In various embodiments, different processes 402 may be virtualized within one or more virtualization environments 404. The virtualization environments execute on a host 406 that runs over hardware 408 that is isolated from the suspicious data and/or processes. The control module 310 may analyze results to identify when suspicious behavior is present (e.g., value X), in what sequence the suspicious behavior occurs (e.g., value Y), and in what process it occurs (e.g., value Z).
For example, a particular process 402 may be intercepted and tested in a variety of different virtualization environments 404. Each virtualization environment 404 may operate on a host 406 (e.g., operating system and/or virtual machine software) that executes over a digital device's hardware 408. The functions of the tested process may be isolated from the host 406 and hardware 408. Suspicious or untrusted behavior may be identified within the virtualization. A time of exploitation may be identified as value X, an exploited sequence may be identified as value Y, and a process of exploitation may be identified as value Z.
The X, Y, Z values may form a description of suspicious data or the process which may be used to measure the threat against a threat matrix. In some embodiments, an administrator may store a threat threshold, based on the threat matrix, depending upon the level of risk that is acceptable. The threat matrix may be based on interactions with the operating system, time sequence, resources, or events. In some embodiments, the degree of malicious behavior may be determined based on a threat value (e.g., comprising a function including the X, Y, and Z values). In one example, the interactions with the OS, time sequences, types of interactions, and resources requested may all be elements of the threat matrix. Once a threat value is determined, the threat value may be compared to a threat threshold to determine the degree of maliciousness and/or what actions will be taken. Those skilled in the art will appreciate that the threat threshold may be determined and/or generated based on an administrator's acceptable level of risk.
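By way of illustration only, the following Python sketch models combining X, Y, and Z values into a threat value compared against a threat threshold; the weights, the combining function, and the threshold are hypothetical administrator choices, not values prescribed by the described embodiments.

# Hypothetical sketch of a threat value computed from a time/exploit
# value (X), sequence (Y), and process (Z) against a threat matrix.
THREAT_MATRIX = {
    # operation -> weight reflecting OS interactions, resources
    # requested, and so on (invented values).
    "write_registry": 3.0,
    "open_socket": 2.0,
    "read_file": 0.5,
}

def threat_value(x_ops, y_sequence_len, z_process_weight):
    """Combine X, Y, and Z values into a single threat score."""
    base = sum(THREAT_MATRIX.get(op, 1.0) for op in x_ops)
    return base * (1 + y_sequence_len / 10) * z_process_weight

THREAT_THRESHOLD = 10.0  # set per the administrator's acceptable risk

score = threat_value(["write_registry", "open_socket"],
                     y_sequence_len=4, z_process_weight=1.5)
action = "quarantine" if score > THREAT_THRESHOLD else "continue monitoring"
print(f"threat value {score:.1f} -> {action}")  # 10.5 -> quarantine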
Time, sequence, and process values may be generated for each tested process or data. The time, sequence, and process values may be measured against the threshold using the threat matrix to determine a possible course of action (e.g., quarantine, generate a report, alert an administrator, or allow the process to continue unobstructed).
The X, Y, Z values may be compared to X, Y, Z values associated with the same suspicious data from the emulation environment. If the emulation environment values are different or divergent, further testing within the virtualization environment and/or the emulation environment may be required.
FIG. 5 is a block diagram of an exemplary virtualization module 306 in some embodiments. The virtualization module 306 may comprise a virtual machine module 502, a resource module 504, a monitor module 506, a taint module 508, a time module 510, a state module 512, and a state database 514.
The virtual machine module 502 is configured to generate one or more virtualization environments to process and monitor suspicious data. Those skilled in the art will appreciate that many different virtual machines may be used (e.g., VMWARE® virtual machines or custom virtual machines).
The resource module 504 is configured to provision one or more virtualization environments with plug-ins or other resources. In various embodiments, plug-ins are modules built into the virtualization and emulation environments that collect specific data sets from certain system components. This process may be chained to follow an execution through the system or may run in parallel if there is a threaded malicious or clean object.
In some embodiments, the resource module 504 provisions a virtualization environment with an initial set of resources (e.g., operating system, OS updates, applications, and drivers). In some embodiments, the resource module 504 provisions virtualization environments to include resources based on the destination of the suspicious data (e.g., the digital device targeted to receive the suspicious data), device images provisioned by information technology management, or metadata associated with the suspicious data. In some embodiments, the resource module 504 comprises a pre-processing module that determines specific requirements based on network meta-data to determine which plug-ins should be implemented within the virtualization environment and in what combination the plug-ins may be launched.
In some embodiments, the resource module 504 provisions a virtualization environment based on the suspicious data's similarity to malware or other suspicious data. In one example, the virtualization module 306 may scan and find that the suspicious data appears to be similar to previously tested suspicious data or malware. Subsequently, the resource module 504 may provision one or more virtualization environments to include resources with known vulnerabilities to monitor whether the suspicious data acts in a similarly untrusted manner.
In various embodiments, the resource module 504 provisions a virtualization environment based in part on metadata associated with the suspicious data. For example, the virtualization module 306 may receive or retrieve metadata associated with the suspicious data. The resource module 504 may determine, based on the metadata, that one or more applications are required for the suspicious data to function. Subsequently, the resource module 504 may provision one or more virtualization environments with the necessary applications and related support files (e.g., operating system, shared resources, or drivers).
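By way of illustration only, the following Python sketch models deriving environment descriptions from metadata associated with suspicious data; the metadata keys, resource names, and plug-in names are hypothetical and are not part of the described embodiments.

# Hypothetical sketch of provisioning virtualization environments
# from object metadata; all keys and names are illustrative.
def provision_environments(metadata: dict) -> list[dict]:
    """Build one environment description per required app version."""
    environments = []
    base = {"os": metadata.get("target_os", "windows_xp")}
    for app_version in metadata.get("required_app_versions", ["6.0"]):
        env = dict(base)
        env["browser"] = f"internet_explorer_{app_version}"
        env["plugins"] = ["monitoring", "taint_analysis"]
        environments.append(env)
    return environments

meta = {"target_os": "windows_7",
        "required_app_versions": ["6.0", "7.0", "8.0"]}
for env in provision_environments(meta):
    print(env)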
Those skilled in the art will appreciate that multiple virtualized environments may be instantiated. Each of the virtualized environments may have one or more different resources. In one example, one virtualized environment may include INTERNET EXPLORER® browser v. 6 while another virtualized environment may include INTERNET EXPLORER® browser v. 7. Different virtualized environments may include, in some embodiments, different browser programs (e.g., MOZILLA® FIREFOX® browser), different operating systems (e.g., UNIX® operating system), and/or different drivers. The different virtualization environments may have similar applications or operating systems but different versions or different patches or updates. In this way, the same suspicious data may be processed using different resources. If the suspect data behaves differently with one browser than with another, then there is evidence that the suspicious data may be malware.
In various embodiments, suspicious data is processed in a plurality of different virtualized environments where each of the different virtualized environments includes a limited number of differences. As a result, if malware is only effective in the presence of INTERNET EXPLORER® browser v. 6.0 (i.e., there is a vulnerability in INTERNET EXPLORER® browser v. 6.0 that the malware is programmed to exploit), then the malware's behavior as well as the exploit may be identified.
The control module 310 may provision the virtualization module 306. In some embodiments, the control module 310 may review metadata associated with the suspicious data to determine resources to be available in one or more virtualization environments. Those skilled in the art will appreciate that the metadata may come from a variety of sources. For example, some metadata may be apparent from the suspicious data such as a file extension or calls associated with the suspicious data. In some embodiments, the control module 310 may retrieve information regarding the suspicious data in order to provision the virtualization environment. For example, the control module 310 may determine that the suspicious data may be similar to other malware or suspicious data and provision one or more virtualized environments in a manner to see if the newly acquired suspicious data behaves in an untrusted manner.
The control module 310 may also provision the emulation module 308. In some embodiments, the control module 310 may review metadata associated with the suspicious data to determine resources to be available in one or more emulation environments. The control module 310 may also provision an emulation environment based on the provisioning of one or more virtualized environments. For example, the control module 310 may provision the emulation environment based on a virtualized environment where the suspicious data may have behaved abnormally (e.g., in an environment with a specific version of an operating system, the suspicious data scanned one or more areas of memory and then terminated further operations). The emulation environment may, in some embodiments, share similar resources as what was provided in a virtualization environment.
The virtualization module 306 and/or the collection module 302 may determine resource requirements of or for the suspicious data. In various embodiments, the virtualization module 306 receives metadata associated with the suspicious data to determine resources as described herein. For example, the metadata may indicate that the network data is an executable to be run in a WINDOWS® operating system environment or the metadata may indicate that the network data is an executable file to be operated by a browser (e.g., a web application). The virtualization module 306 and/or the control module 310 may dynamically select a variety of resources to provision and instantiate a virtualization environment in order to process the network data and monitor actions.
In various embodiments, a resource may be missing from one, some, or all of the virtualized environments. For example, the suspicious data may require a different application to be able to execute. In some embodiments, the virtualization module 306 may halt a virtualization environment, dynamically provision the virtualization environment with the necessary resources, and re-instantiate the virtualized environment to monitor for changes in behavior of the suspicious data.
The monitor module 506 is configured to monitor the virtualization environments instantiated by the virtual machine module 502. In various embodiments, the monitor module 506 logs each step or function performed by or for the suspicious data within each virtualization environment. In various embodiments, the monitor module 506 logs each operation of the suspicious data, logs changes caused by the operation (e.g., what information is stored in memory and where in memory the information is stored), and logs at what time the operation occurred.
The monitor module 506 may compare the operations of the suspicious data in various virtualization environments during or after virtualization. When a divergence is identified between a virtualization environment and an emulation environment or between two virtualization environments, the monitor module 506 may generate a flag or track the results to identify if different operations perform untrusted actions.
The taint module 508 is configured to perform taint analysis and/or other techniques to identify and track operations provided by and for the suspect data. As a result, acts associated with the suspicious data, including executions by the suspect data and executions performed by an application or operating system for the suspect data are tracked and logged. By using dynamic taint analysis, the taint module 508 and/or the monitor module 506 may monitor actions to detect whether a value that is normally derived from a trusted source is instead derived by some operation associated with the suspect data.
For example, values such as jump addresses and format strings should usually be supplied by the code itself, not from external untrusted inputs. However, an attacker may attempt to exploit a program by overwriting these values with their own data. In various embodiments, the taint module 508 may initially mark input data from untrusted sources tainted, then monitor program execution to track how the tainted attribute propagates (i.e., what other data becomes tainted) and to check when tainted data is used in dangerous ways (e.g., use of tainted data as jump addresses or format strings which may indicate an exploit of a vulnerability such as a buffer overrun or format string vulnerability). In various embodiments, based on the taint analysis, the monitor module 506 may look for a variable, string, particular component, or feedback that causes a jump in the code.
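By way of illustration only, the following Python sketch models dynamic taint tracking of the kind described above; the variable names and alert format are hypothetical and are not part of the described embodiments.

# Hypothetical sketch of dynamic taint tracking: untrusted inputs are
# marked tainted, propagation is tracked, and dangerous uses (e.g., a
# tainted jump address) are reported.
tainted: set[str] = set()

def mark_untrusted_input(var: str) -> None:
    tainted.add(var)

def propagate(dst: str, *srcs: str) -> None:
    # The destination becomes tainted if any source operand is tainted.
    if any(s in tainted for s in srcs):
        tainted.add(dst)

def check_use_as_jump_target(var: str) -> None:
    if var in tainted:
        print(f"ALERT: tainted value {var!r} used as a jump address")

mark_untrusted_input("network_buffer")
propagate("length_field", "network_buffer")  # derived from untrusted data
propagate("return_address", "length_field")  # overwrite via overflow
check_use_as_jump_target("return_address")   # fires the alert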
In various embodiments, the monitor module 506 and/or the taint module 508 may be plug-ins within the virtualization environment. In one example, the resource module 504 may provision a monitoring plug-in and a taint analysis plug-in with one or more virtualization environments.
Those skilled in the art will appreciate that the virtualization module 306 (e.g., via the monitor module 506) may detect attacks at time of use in the virtualized environment as well as at the time of writing to memory. In some embodiments, the virtualization module 306 detects when a certain part of memory is illegitimately overwritten by the suspicious data at the time of writing to the memory.
The time module 510 provides system resources as expected by the object, creating the perception of accelerated time within the virtualization and/or emulation environments. By increasing or slowing clock signals and processing, the suspicious data may be analyzed in a more detailed manner and/or in less time than if the clock signal were allowed to operate in real time.
In some embodiments, malware requires a passage of time. For example, some malware requires seconds, minutes, days, or weeks to pass before becoming active. The time module 510 may increase the clock time in the virtualization or emulation environments in order to trigger suspicious behavior.
Further, the time module 510 can slow clock time within the virtualization and/or emulation environments. For example, the time module 510 may take time slices to specifically identify and characterize processes that are taken by or for the suspicious data. In some embodiments, time slice information may be used to isolate an attack vector, describe the suspicious data, or determine the target of the attack. For example, time slice information may indicate that at a certain time and associated step, the suspicious data takes over a formerly trusted process. This information may be used to characterize malware such that when other suspicious data take similar action at the same time and associated step, the suspicious data may be classified as a similar type of malware. The time module 510 may also segment operations by or for the object in the virtualization environment and the emulation environment to simplify comparisons of operations between the virtualization environment and the emulation environment.
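By way of illustration only, the following Python sketch models a virtual clock that can be accelerated to trigger time-delayed behavior and sliced to attribute operations to fine-grained intervals; the class and its interface are invented for the example and are not part of the described embodiments.

# Hypothetical sketch of a virtual clock supporting acceleration and
# time slicing within an analysis environment.
class VirtualClock:
    def __init__(self):
        self.now = 0.0    # virtual seconds since instantiation
        self.slices = []  # (start, end, operations) records

    def advance(self, seconds: float) -> None:
        """Accelerate time without real waiting."""
        self.now += seconds

    def record_slice(self, duration: float, operations: list[str]) -> None:
        """Attribute operations to a fine-grained time slice."""
        self.slices.append((self.now, self.now + duration, operations))
        self.now += duration

clock = VirtualClock()
clock.advance(7 * 24 * 3600)  # jump one virtual week ahead
clock.record_slice(0.001, ["hijack_trusted_process"])
for start, end, ops in clock.slices:
    print(f"[{start:.3f}, {end:.3f}] {ops}")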
In various embodiments, the state module 512 tracks the various states of the virtualization environment (e.g., the time, date, and process, as well as what was stored in memory, where it was stored, and when). In some embodiments, the virtual machine module 502 may halt a virtualization environment or instantiate a new virtualization environment utilizing the states of a previous virtualization. For example, the state module 512 may monitor the behavior of suspicious data which suspiciously terminates at time T. The virtual machine module 502 may instantiate a new virtualization environment. The state module 512 may perform dynamic state modification to change the new virtualization environment to include the logged states of the previous virtualization environment at time T. In some embodiments, the state module 512 and/or the time module 510 may increase the clock signal, decrease the clock signal, or simply change the clock signal depending on the processing of the suspicious data that needs to occur. As a result, the suspicious data may be allowed to execute in a similar environment at the desired time. Those skilled in the art will appreciate that the new virtualization environment may be slightly different (e.g., include and/or not include one or more resources) from the previous virtualization environment. In some embodiments, the virtual machine module 502 does not instantiate a new virtualization environment but rather halts the previous virtualization environment and re-instantiates the previous virtualization environment at a previously logged state with one or more resources.
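By way of illustration only, the following Python sketch models dynamic state modification: the state of a virtualization environment is captured at a given time and loaded into a newly instantiated environment, optionally with additional resources; the data structures are hypothetical and are not part of the described embodiments.

# Hypothetical sketch of capturing and restoring environment state
# for dynamic state modification.
import copy

class EnvironmentState:
    def __init__(self, time, memory, resources):
        self.time = time            # virtual clock value at capture
        self.memory = memory        # address -> contents
        self.resources = resources  # provisioned applications, drivers

state_database = {}

def snapshot(env_id: str, state: EnvironmentState) -> None:
    """Log the state of a virtualization environment."""
    state_database[env_id] = copy.deepcopy(state)

def instantiate_from(env_id: str, extra_resources=()) -> EnvironmentState:
    """New environment at the logged state, optionally re-provisioned."""
    state = copy.deepcopy(state_database[env_id])
    state.resources = list(state.resources) + list(extra_resources)
    return state

snapshot("env-1", EnvironmentState(42.0, {0x1000: b"\x90\x90"}, ["ie_6.0"]))
resumed = instantiate_from("env-1", extra_resources=["missing_driver"])
print(resumed.time, resumed.resources)  # 42.0 ['ie_6.0', 'missing_driver']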
The state database 514 is a database configured to store the state of one or more virtualization environments and/or one or more emulation environments. Those skilled in the art will appreciate that the state database 514 is not limited to databases but may include any data structure.
If the behavior of the suspicious data is also suspicious, the virtualization module 306 may halt the virtualization environment and provide new resources. For example, if the suspicious data begins to execute a program but abruptly halts, prepares to run an executable but does not actually run the executable, or constantly checks a section in memory that should typically be empty, then the virtualization module 306 may instantiate new virtualization environments and/or re-provision existing virtualization environments with different resources to see if the suspicious data acts differently. In various embodiments, the emulation module 308 may instantiate an emulation environment to test the suspicious data.
In various embodiments, the virtualization module 306 tracks different behaviors by different suspicious data in order to identify complex attacks, distributed attacks and/or advanced persistent threats (APT). For example, one type of malware may store an executable in a specific place in memory and then, possibly much later, a second type of malware may access the stored executable and attack a computerized system. The virtualization module 306 may identify and record the behavior of suspicious data which, when executed in a virtualization environment, only stores an executable in a specific place in memory but performs no other functions. If other data is executed in the virtualization environment which checks that specific place in memory, the virtualization module 306 may halt the virtualization, provision the executable from the previous data in the specific location in memory, and re-run the virtualization environment to monitor changes.
FIG. 6 is an exemplary virtualization environment 600 for detection of malware in some embodiments. The virtualization environment 600 comprises objects 602, a network 604, applications 606, operating system 608, a virtual machine 610, a hypervisor 612, a manager 614, a dynamic state manager 616, and a page table manager 618. Objects include, but are not limited to, suspicious data and/or processes that are tested in the virtualization environment 600. The network 604 comprises resources to allow the objects 602 to function and/or operate with access to network resources (e.g., network drivers and ports).
The applications 606 include one or more applications or other resources that function with the objects 602 to operate in the virtualization. The applications may include word processing applications, web browsers, applets, scripting engines, and the like. Different virtualization environments may include different applications and/or different versions. For example, one virtualization environment may comprise INTERNET EXPLORER® browser v. 9 while another virtualization environment may comprise MOZILLA® FIREFOX® browser v. 5.0. In another example, one virtualization environment may comprise INTERNET EXPLORER® browser v. 9 while three other virtualization environments may comprise v. 8, v. 7, and v. 6 of the INTERNET EXPLORER® browser, respectively.
The operating system 608 includes all or part of the operating system necessary for the objects 602 to function within the virtualization. The operating system may include, for example, the UBUNTU® LINUX®, WINDOWS XP®, or OS X MOUNTAIN LION™ operating systems. Different virtualization environments may include different operating systems 608, and/or include different versions of operating systems 608 (e.g., WINDOWS XP® and WINDOWS® 7.0 operating systems). Further, different virtualization environments may include different applied patches and upgrades.
The virtual machine 610 may include any number of virtual machines configured to generate one or more virtualization environments to process the objects 602. The hypervisor 612 (e.g., a kernel or virtual machine manager) manages resources for the virtualizations and may allow multiple operating systems (e.g., guests) to run concurrently on the host computer. The hypervisor may manage execution of the guest operating systems.
The manager 614 is configured to manage monitoring and control of the virtualization environment 600. In various embodiments, the control module 310 controls the virtualization environment 600, including the provisioning, time acceleration, and logging, through the manager 614.
The dynamic state manager 616 (i.e., DSM) tracks and logs the state of the machine. The DSM may also store the state for later use within the same or different virtualization environments (e.g., for dynamic state modification). The state may include, for example, the object or object identifier, resources available, time slices when events occurred, and logged events. The DSM 616 may also record the contents of memory, and the locations of those contents in memory, over time.
The page table manager 618 may receive one or more page tables from the emulation environment. In various embodiments, the object may be tested within both the virtualization environment and the emulation environment. Upon detection of a divergence of operations between the operations of the virtualization environment and the operations of the emulation environment, the emulation module 308 may log the state of the emulation environment and pass the state information to the virtualization environment 600 as a page table for dynamic state modification of the virtualization environment. In some embodiments, the virtualization module 306 re-instantiates the original virtualization environment (e.g., instantiates a modified image of the virtualization environment) and dynamically modifies the state of the virtualization environment using the page table(s) from the emulation environment or the virtualization module 306 may instantiate a new virtualization environment and load the information from the page table.
FIG. 7 is a flow diagram of an exemplary malware detection method. In step 702, an object is intercepted by a data collector. The data collector may be placed on any digital device and/or network device. In step 704, the resource module 504 inspects what resources the object may require for processing (e.g., dynamic libraries and/or registries the object may affect). In some embodiments, the collector includes metadata including where the object came from, where the object was to be received, and/or what application created the request. The resource module 504 may perform preprocessing by determining what resources are required based on the metadata.
In step 706, the virtual machine module 502 instantiates a first instance of a virtualization environment with one or more resources identified by the resource module 504. In one example, the virtual machine module 502 selects and initiates plug-ins within the virtualization environment for memory allocation, forensics, mutex, filesystem, monitoring, taint analysis, and the like. In step 708, the object is executed and/or processed within the virtualization environment.
In step 710, the taint module 508 taints operations of the object within the virtualization environment. The taint module 508 may be a plug-in. In some embodiments, the taint module 508 taints the object, bit by bit, with trace capture information. In step 712, as data propagates through the application, the monitor module 506 monitors the operations, assessing what resources were previously allocated and what resources are actually allocated and called within the virtualization environment.
Resources that are required and/or called by the object which were not initially provisioned may be assessed as further evidence of malware. In some embodiments, sets of newly requested resources may be assessed to determine the likelihood of malware. For example, a particular set of resources may be determined to be malicious. If an object calls that particular set of resources (e.g., by calling resources that have not been initially provisioned, calling resources that were initially provisioned, or calling a combination of resources of which only a few were initially provisioned), the object may be determined to be malicious.
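By way of illustration only, the following Python sketch models assessing whether the set of resources called by an object covers a known-malicious resource set; the resource sets shown are invented examples and are not part of the described embodiments.

# Hypothetical sketch of checking called resources against
# known-malicious resource sets.
MALICIOUS_RESOURCE_SETS = [
    {"kernel32.dll", "raw_socket", "keyboard_hook"},
    {"registry_run_key", "scheduled_task", "raw_socket"},
]

def resource_set_is_malicious(called: set[str]) -> bool:
    """True if the calls cover any known-malicious resource set."""
    return any(s <= called for s in MALICIOUS_RESOURCE_SETS)

called_resources = {"kernel32.dll", "raw_socket",
                    "keyboard_hook", "user32.dll"}
print(resource_set_is_malicious(called_resources))  # True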
In step 714, the monitor module 506 may identify untrusted actions from monitored operations. The monitor module 506 may be a plug-in. In various embodiments, the virtual machine module 502 may load only those resources called by the resource module 504 within the virtualization environment. If the object calls a driver that is not originally provided in the virtualization environment (e.g., the object went outside of the original boundaries or the initially accepted criteria), the object's operations may terminate. In some embodiments, the virtualization environment is re-instantiated or a new virtualization environment may be instantiated that includes the additionally called resource to further process and monitor the operations of the object.
In some embodiments, the object runs in a plurality of virtualization environments until all operations called on by or for the object are completed. The control module 310 may compare the operations performed by or for the object in one virtualization to actions performed in another virtualization to analyze for divergence. If the actions taken were similar between the two virtualization environments, then no divergence was found. If the actions taken were different, divergence is found and the differences may be further assessed (e.g., found untrusted actions taken when an unpatched operating system was present).
Divergence may be evidence of malware. For example, if the object ceases to perform any operations at time T in one virtualization environment but continues to perform many additional operations after time T in another virtualization environment (e.g., use of different resources, point to different points in memory, open a socket, or open up output ports), the difference in the environment (e.g., an available exploit) likely influenced the actions of the object and, as such, vulnerabilities may be identified.
In some embodiments, the operations taken for or by the object within the virtualization environment may be measured to determine a threat value. The threat value may be compared to a customizable threshold to determine if the behavior of the object is untrustworthy. In some embodiments, the threat value is determined based on X values and Y values. The X values may include those operations taken by a plug-in while the Y value correlates to the plug-in and the virtualization environment (e.g., operating system, kernel, or hypervisor). These two values may be part of a function to determine the threat value of each operation by or for the object, an entire execution path of the object, or a part of the execution path of the object. In one example, operations taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object. The threat value may be compared to a threat threshold to determine if the effect of the object within the virtualization environment is sufficiently trustworthy or if the object is behaving in a manner that is sufficiently suspicious to warrant running the object through the emulation environment. Further, the threat value may be compared to the threat threshold to determine that the operations are such that they may be characterized as untrusted and, therefore, the object may be quarantined and further corrective action may be taken.
In various embodiments, the threat value associated with one or more objects may be increased (e.g., determined to be more threatening and, therefore, indicative of an increased likelihood of maliciousness) based on the resources called by the object. As discussed herein, for example, a particular set of resources may be determined to be malicious. If an object calls that particular set of resources, a threat value associated with the object may signify a significantly increased likelihood of maliciousness.
In step 716, the reporting module 312 generates a report identifying operations and untrusted actions of the object. The reporting module 312 may generate a report identifying the object, the payload, the vulnerability, the object of the attack, recommendations for future security, and so on.
Those skilled in the art will appreciate that using signatures to identify suspicious data or malware may be optional. For example, suspicious data may be provided to the virtualization environment. If the suspicious data behaves in a manner similar to known malware, a class of malware, or a class of data with suspicious behavior, then the object may be quarantined and remedial action taken (e.g., the user of the target digital device may be notified). In some embodiments, the process of testing the suspicious data within a virtualization environment to determine a potential threat may be faster than utilizing signatures in the prior art.
FIG. 8 is a flow diagram of an exemplary method of controlling a virtualization environment to detect malware. In step 802, the state module 512 may log a first instance of the virtualization environment. For example, the state module 512 may log or track the state of the virtualization environment (e.g., time, memory values, location of data within memory, and/or ports called). The state module 512 may log the state of a plurality of virtualization environments operating in parallel.
In step 804, the virtual machine module 502 may halt the first instance of the virtualization environment. For example, the object may have terminated functions after requesting a resource not originally provided in the first instance of the virtualization environment. In some embodiments, the request for a resource not originally provisioned is evidence of malware (e.g., requesting access to a resource that the object should not have reason to access). In various embodiments, the virtual machine module 502 may permit the first instance of the virtualization environment to continue running and the virtual machine module 502 may instantiate a new instance of the virtualization environment.
In step 806, the resource module 504 determines additional resources for the object. For example, if the object requests a resource not originally provided in the first instance of the virtualization environment, the resource module 504 may identify the desired additional resource. In various embodiments, if a divergence is also detected with another virtualization environment, the resource module 504 may also identify differences in resources between the first and other virtualization environments.
In step 808, the virtual machine module 502 re-instantiates the first instance of the virtualization environment including the previously identified resources at the previously logged state. As a result, the object may be presented with an environment that may appear to be unprotected. Further, in step 810, the time module 510 may accelerate the clock signal to the time the object requested the unavailable resource.
In step 812, the monitor module 506 may monitor operations by or for the object within the re-instantiated virtualization environment. In some embodiments, the monitor module 506 monitors the operations by or for the object as if the virtualization environment had not changed. In some embodiments, a plug-in monitors the operations by or for the object and provides information to the monitor module 506. In step 814, the monitor module 506 may identify untrusted actions from monitored operations. As discussed herein, the operations, either taken alone or in combination, may be used to determine a threat value. The threat value may be compared to a threat threshold to determine if the object is behaving suspiciously, not behaving suspiciously, or behaving in an untrustworthy manner.
In step 816, the reporting module 312 may generate a report identifying suspicious or untrusted operations as well as any untrusted actions (e.g., vulnerability exploits, target of payload, defenses of the object and so on).
Those skilled in the art will appreciate that the first instance of the virtualization environment may not be halted. In some embodiments, a new instance of the virtualization environment is instantiated (without halting the previous instance) including the state information and the like. In various embodiments, the first instance of the virtualization environment is halted and then re-instantiated including the state information.
FIG. 9 is a flow diagram of an exemplary model to detect malware through multiple virtualization environments. In step 902, the collection module 302 collects the object and the resource module 504 determines one or more required resources.
In step 904, the virtual machine module 502 may instantiate the first instance of the virtualization environment with the determined resources. Further, in step 906, the virtual machine module 502 may instantiate a second instance of the virtualization environment but with resources that are different from those provided in the first instance of the virtualization environment. For example, versions of applications may be different, operating system patches may be different, or the like.
In step 908, the virtual machine module 502 executes the object within the first and second instances of the virtualization environment. In step 910, the monitor module 506 may monitor operations of the object within the first and second virtualization environments. In various embodiments, the monitor module 506 traces the operations of the object in both virtualization environments. As discussed herein, a trace may be based on X values (e.g., operations by or on a plug-in of the virtualization environment) and Y values (e.g., operations between the operating system and the plug-in, which may be coordinated with the X values). In some embodiments, not all operations are relevant. In some embodiments, one or more actions or operations by the host during processing may be compared against a check system to determine if the action or operation is relevant. If the action or operation is relevant, then the action or operation may be given weight and may affect the trace. If the action or operation is not relevant, then the action or operation may be given no weight and may not affect the trace.
In step 912, the control module 310 or the monitor module 506 compares the operations of the first instance and the operations of the second instance to determine divergence. In one example, the traces of the object in the respective virtualization environments may form an execution tree which may be compared to other execution trees associated with other virtualization environments.
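By way of illustration only, the following Python sketch models building relevance-filtered traces and comparing them for divergence; the actor labels, events, and relevance check are hypothetical and are not part of the described embodiments.

# Hypothetical sketch of relevance-filtered traces compared for
# divergence between two virtualization environments.
RELEVANT_ACTORS = {"object", "plugin"}  # host noise is filtered out

def build_trace(events):
    """Keep only relevant (actor, operation) events for comparison."""
    return [(actor, op) for actor, op in events if actor in RELEVANT_ACTORS]

def traces_diverge(trace_a, trace_b) -> bool:
    return build_trace(trace_a) != build_trace(trace_b)

env1 = [("host", "page_fault"), ("object", "open_file"), ("object", "exit")]
env2 = [("object", "open_file"), ("object", "open_socket")]
print(traces_diverge(env1, env2))  # True: behavior differs past open_file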
In one example, divergence between the traces of the two virtualization environments may be found. In various embodiments, the control module 310 may halt one or both of the virtualization environments and may notify an administrator of malware. In some embodiments, the control module 310 continues processing the object within one or both virtualization environments to further identify characteristics of the suspicious data, targeted vulnerabilities, payload, goal, or the like.
In step 914, the reporting module 312 generates a report identifying operations, suspicious behavior, and/or untrusted actions of the object based, in part, on the comparison. For example, the reporting module 312 may identify the exploit that is present in some digital devices but not others. Further, the report may include recommendations to improve security (e.g., moving valuable information to a more secure location).
FIG. 10 is a block diagram of an exemplary digital device 1000. The digital device 1000 comprises a processor 1002, a memory system 1004, a storage system 1006, a communication network interface 1008, an I/O interface 1010, and a display interface 1012 communicatively coupled to a bus 1014. The processor 1002 is configured to execute executable instructions (e.g., programs). In some embodiments, the processor 1002 comprises circuitry or any processor capable of processing the executable instructions.
The memory system 1004 is any memory configured to store data. Some examples of the memory system 1004 are storage devices, such as RAM or ROM. The memory system 1004 can comprise the RAM cache. In various embodiments, data is stored within the memory system 1004. The data within the memory system 1004 may be cleared or ultimately transferred to the storage system 1006.
The storage system 1006 is any storage configured to retrieve and store data. Some examples of the storage system 1006 are flash drives, hard drives, optical drives, and/or magnetic tape. In some embodiments, the digital device 1000 includes a memory system 1004 in the form of RAM and a storage system 1006 in the form of flash memory. Both the memory system 1004 and the storage system 1006 comprise computer readable media which may store instructions or programs that are executable by a computer processor including the processor 1002.
The communication network interface (com. network interface) 1008 can be coupled to a network (e.g., communication network 114) via the link 1016. The communication network interface 1008 may support communication over an Ethernet connection, a serial connection, a parallel connection, or an ATA connection, for example. The communication network interface 1008 may also support wireless communication (e.g., communication using the 802.11 a/b/g/n standard or the WIMAX® standard). It will be apparent to those skilled in the art that the communication network interface 1008 can support many wired and wireless standards.
The optional input/output (I/O) interface 1010 is any device that receives input from the user and outputs data. The optional display interface 1012 is any device that is configured to output graphics and data to a display. In one example, the display interface 1012 is a graphics adapter. It will be appreciated that not all digital devices 1000 comprise either the I/O interface 1010 or the display interface 1012.
It will be appreciated by those skilled in the art that the hardware elements of the digital device 1000 are not limited to those depicted in FIG. 10. A digital device 1000 may comprise more or fewer hardware elements than those depicted. Further, hardware elements may share functionality and still be within various embodiments described herein. In one example, encoding and/or decoding may be performed by the processor 1002 and/or a co-processor located on a GPU (e.g., an NVIDIA® GPU).
FIG. 11 is a conceptual block diagram of an emulation environment 1100 in some embodiments. The emulation environment 1100 may be instrumented and allow the object direct access to memory. As a result, malware that searches for evidence of virtualization or evidence of a security program may conclude that a target machine is sufficiently unprotected and, as such, may engage in malicious behavior without early termination.
The emulation environment 1100 comprises the process 1102 being tested, the hypervisor 1104, the host 1106, and the hardware 1108. The process 1102 may comprise the functions of or for an object received from the virtualization module 306.
The hypervisor 1104 may provision the emulation environment 1100 and synchronize operations between one or more virtualization environments and the emulation environment 1100. In some embodiments, the hypervisor 1104 initially provisions the emulation environment based on metadata associated with the data to be assessed and/or resources identified within the virtualization environment(s).
In some embodiments, the hypervisor 1104 may be an emulation manager configured to control the emulation environment. In one example, the hypervisor 1104 may redirect commands between the process 1102, host 1106 and/or hardware 1108. In various embodiments, the hypervisor 1104 may receive trace information from a trace capture plug-in within the emulation environment 1100 to trace behavior of the object (e.g., commands from and/or responses to the object in the emulation environment 1100). In various embodiments, the hypervisor 1104 is a kernel.
The host 1106 comprises the host system (e.g., operating system), support applications, and other data at the O/S layer. The hardware 1108 includes the drivers and hardware interfaces at the hardware layer.
In various embodiments, the hypervisor 1104 determines trace values to compare against the trace values of the virtualization environment. Since the emulation environment is not a virtualization environment, the object may behave in a different manner and, as such, the trace values between the emulation environment and the virtualization environment may be different. In various embodiments, a control module 310 may perform divergence analysis by comparing the trace values from the virtualization module 306 and the emulation module 308. If the values are different, the control module 310 may halt the virtualization environment to control the virtualization and include one or more responses recorded in the emulation environment which is further discussed herein.
Those skilled in the art will appreciate that there may be any number of emulation environments. For example, there may be multiple emulation environments operating on one or more digital devices.
As discussed herein, the emulation module 308 may generate trace values to compare with tracing of the object in the virtualization environment to detect divergence. In some embodiments, the events associated with the object may be evaluated based on a time value X′, a sequence value Y′ and a process value Z′. These values may be compared to values of the virtualization environment to identify divergence. Divergence detection is further discussed herein.
FIG. 12 is a block diagram of an exemplary emulation module 308 in some embodiments. The emulation module 308 comprises an emulation engine 1202, a plug-in module 1204, a trace capture module 1206, a recording module 1208, a manager module 1210, a time module 1214, and a hierarchical reasoning engine (HRE) 1216.
The emulation module 308 implements an emulation environment and may be instrumented (e.g., via plug-ins that operate with and/or within the emulation environment). The emulation module 308 may allow an object direct memory access. The emulation module 308 may instantiate any number of emulation environments. In one example, the emulation module 308 operates three different emulation environments in parallel.
The plug-in module 1204 is configured to load one or more plug-ins into the emulation environment to process the object. The plug-ins may include application-layer information (e.g., Adobe, shared drivers, and mutex), network information (e.g., available port), and the like. In various embodiments, the plug-in module 1204 does not comprise plug-ins for security or tracking operations which may be detected by the object. There may be any number of plug-ins for a given emulation environment. In some embodiments, there are eight initial plug-ins in the emulation environment.
In some embodiments, the resource module 504 of the virtualization module 306 provides a list of required resources and/or metadata to the plug-in module 1204. The plug-in module 1204 may provision the emulation environment based in part on the information received from the resource module 504, the object, and/or metadata associated with the object. In some embodiments, the resource module 504 is a hypervisor.
The trace capture module 1206 is configured to track an execution path for the object. For example, the trace capture module 1206 may trace actions taken for and by the object in the emulation module 308. In various embodiments, the trace capture module 1206 may be within the hypervisor layer, be a plug-in, or be a combination of both. As a result, the trace capture module 1206 and/or the functions of the trace capture module 1206 may be invisible to the object.
In various embodiments, the trace capture module 1206 traces the operations of and for the object in the emulation environment. As discussed herein, the trace capture module 1206 may generate a trace for the object in the emulation environment based on actions of the plug-ins (e.g., an X trace) and actions taken that correlate between the emulation environment and the plug-ins (e.g., a Y trace). This trace capture process may similarly be performed in the virtualization environment, where the X trace may be associated with actions of the plug-ins of the virtualization environment and the Y trace may be associated with actions that correlate between the virtualization environment and the plug-ins. The manager module 1210 may compare the trace for the object in the emulation environment to a trace for the object in the virtualization environment to detect divergence.
In various embodiments, the trace for the object in the emulation environment and/or virtualization environment may be filtered such that not every action taken in the environment is used to generate the trace. For example, not all of the actions taken by the host system during processing of the object may be relevant to the trace. In some embodiments, the trace capture module 1206 may generate a trace based only on relevant actions or operations. In one example, the trace capture module 1206 filters the actions and operations of the host and/or one or more plug-ins during processing of the object. In some embodiments, one or more actions or operations taken by the host during processing may be compared against a check system to determine whether the action or operation is relevant. If the action or operation is relevant, it may be given weight and may affect the trace. If it is not relevant, it may not be considered when developing the trace. Those skilled in the art will appreciate that similar filtering may occur in determining the trace in the virtualization environment.
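A minimal sketch of such relevance filtering follows, assuming a hypothetical category-based check system and per-operation weights; none of the names below are drawn from the disclosure:

```python
# Hypothetical sketch: keep only operations deemed relevant by a check
# system; irrelevant host noise does not contribute to the trace.
RELEVANT_CATEGORIES = {"file", "registry", "network", "process"}

def is_relevant(operation):
    # Stand-in for the "check system": here, a simple category test.
    return operation["category"] in RELEVANT_CATEGORIES

def build_trace(operations):
    trace = []
    for op in operations:
        if is_relevant(op):
            # Relevant operations are given weight and affect the trace.
            trace.append((op["time"], op["name"], op.get("weight", 1.0)))
    return trace

ops = [
    {"time": 0.1, "name": "scheduler_tick", "category": "host"},  # filtered out
    {"time": 0.2, "name": "CreateFile", "category": "file"},      # kept
]
assert build_trace(ops) == [(0.2, "CreateFile", 1.0)]
```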
The recording module 1208 may record operations by or for the object in the emulation environment. In some embodiments, the recording module 1208 records responses to the object within the emulation environment (e.g., responses from the host). The recording module 1208 may also track the state of the emulation environment (e.g., what is stored in memory, where in memory is data stored, and/or time of operation(s)). Those skilled in the art will appreciate that the recording module 1208 may record any kind of information including information from the object, information for the object, or information generated on behalf of the object.
The time module 1214 may record times of events, record time states of actions or operations in the emulation environment, and accelerate time (e.g., the clock signal) within the emulation environment to detect changes in the behavior of the object. For example, some malware is configured to wait a predetermined period of time before acting maliciously. In some embodiments, the time module 1214 may accelerate one or more clock signals in the emulation environment such that the object is given a period of time to trigger untrusted behavior.
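The following sketch illustrates one way a scaled clock might be modeled; the VirtualClock class is a hypothetical stand-in, and a real implementation would manipulate guest timers or clock interrupts rather than a Python counter:

```python
# Hypothetical sketch: a virtual clock whose rate can be accelerated so
# that time-delayed malware triggers sooner inside the emulation environment.
class VirtualClock:
    def __init__(self, scale=1.0):
        self.scale = scale        # 1.0 = real time; larger = accelerated
        self.virtual_now = 0.0    # guest-visible time, in seconds

    def tick(self, real_elapsed_seconds):
        # Advance virtual time faster (or slower) than real time.
        self.virtual_now += real_elapsed_seconds * self.scale
        return self.virtual_now

clock = VirtualClock(scale=3600.0)   # one guest hour per real second
clock.tick(1.0)
# Malware that sleeps for 30 guest-minutes would now see its timer expire.
assert clock.virtual_now >= 1800
```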
When divergence is detected based on a comparison of traces of the virtualization environment and the emulation environment, the virtualization module 306 may re-instantiate the virtualization environment. In various embodiments, the state of the emulation environment, including the resources, data in memory, locations of data in memory, clock signal, and/or the like may be loaded into the virtualization environment upon re-instantiation. The virtualization environment may begin to process the object at the time (or the time preceding) the divergence between the virtualization environment and the emulation environment. The object may be given at least part of the recorded information from the emulation environment. The recorded information may include all or part of a response to the object at the time divergence was detected. Those skilled in the art will appreciate that the object may receive the response and act within the virtualization environment as if the object had been received by the target system and a proper response was received. Subsequently, the virtualization module 306 may continue to trace the behavior of the object within the virtualization environment. The new trace may also be compared to the trace of the emulation environment to determine if a divergence is found. If there is no divergence, the virtualization environment may continue to process the suspicious data to look for untrusted behavior.
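For illustration, a hedged sketch of re-instantiation from recorded emulation state follows; the EmulationRecord structure, the per-step snapshot granularity, and the replayed-response handoff are assumptions, not the disclosed implementation:

```python
# Hypothetical sketch: on divergence, the virtualization environment is
# re-instantiated from state recorded in the emulation environment, and the
# recorded response is replayed so the object proceeds as if the target
# system had answered properly.
import copy

class EmulationRecord:
    def __init__(self):
        self.states = []      # per-step snapshots: memory contents, layout, clock
        self.responses = []   # per-step responses delivered to the object

    def snapshot(self, state, response):
        self.states.append(copy.deepcopy(state))
        self.responses.append(response)

def reinstantiate(record, divergence_index):
    """Build fresh virtualization state at (or just before) the divergence."""
    step = max(0, divergence_index - 1)
    new_state = copy.deepcopy(record.states[step])
    replayed = record.responses[step]   # recorded response the object expects
    return new_state, replayed

record = EmulationRecord()
record.snapshot({"clock": 0.0, "memory": {}}, response=None)
record.snapshot({"clock": 5.0, "memory": {0x1000: "config"}}, response="ACK")
state, response = reinstantiate(record, divergence_index=2)
assert state["clock"] == 5.0 and response == "ACK"
```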
In some embodiments, the virtualization module 306 and the emulation module 308 may operate the virtualization environment and emulation environment in parallel. For example, after an object is identified as behaving suspiciously, then an emulation environment may be instantiated for processing the object. When a divergence between the virtualization environment and the emulation environment is detected, the virtualization environment may be re-instantiated (e.g., the instance of the virtualization environment may be restarted or a new virtualization environment may be instantiated) with some of the recorded information from the emulation environment. The virtualization environment and emulation environment may continue to process the object in parallel. If a new divergence is detected between the two environments, the emulation environment and/or the virtualization environment may halt and the virtualization environment re-instantiated with new recorded information and/or new state information from the emulation environment. The process may continue until the processing of the object is completed.
In various embodiments, once the virtualization environment is re-instantiated, the emulation environment may be halted or terminated until or unless the object is determined to be behaving in an untrusted manner.
The HRE 1216 may be configured to determine maliciousness or provide information that may increase a threat value associated with the object being processed. In some embodiments, the HRE 1216 assesses the behavior of the data being processed. In one example, the HRE 1216 assesses requests for resources that deviate from the initially identified resources, assesses deviations between the virtualization environment and emulation environment, and assesses the significance of a series of actions. If combinations of resource calls and/or actions have been identified as malicious, then the HRE 1216 may flag the data as malicious or increase one or more threat values.
In various embodiments, sets of resources, sets of requested resources not initially provisioned, and/or sets of actions performed by a suspicious object may be associated with malicious behavior. In one example, sets of resources and/or actions may be identified as malicious or as having an increased likelihood of being malicious. The HRE 1216 may review the activities of an object within the virtualization environment and/or emulation environment to determine if a set of requested resources and/or actions is similar to known malicious sets of resources and/or actions. If the HRE 1216 identifies a set of requested resources and/or actions as being malicious, the HRE 1216 may provide threat information that may be heavily weighted in determining the risk of the object (e.g., reduce a value of trustworthiness associated with the object).
FIG. 13 is a flow diagram of an exemplary malware detection method utilizing an emulation environment in some embodiments. In step 1302, an object is intercepted by a data collector. In various embodiments, the data collector, an agent associated with the data collector, and/or a security server 108 may test the object to determine if the object is suspicious. For example, the security server 108 may compare a hash of the object to a whitelist, compare a hash of the object to a blacklist, apply heuristic analysis, or apply statistical analysis. The security server 108 may also apply one or more rules to determine if the object is suspicious. For example, the security server 108 may have rules that flag an object as suspicious if the object came from an untrusted source, was being sent to a critical destination (e.g., the digital device that is to receive the object contains credit card numbers, health records, trade secret information, or other sensitive information), or was otherwise sent in an atypical manner (e.g., the sending digital device does not normally send objects to the destination digital device).
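A minimal sketch of such pre-screening appears below. The whitelist/blacklist hash sets and the source/destination rules are hypothetical placeholders, and MD5 is used only for brevity of illustration:

```python
# Hypothetical sketch: hash the object, consult white/black lists, then fall
# back to simple rules over the object's source and destination.
import hashlib

WHITELIST = {"d41d8cd98f00b204e9800998ecf8427e"}   # known-good hashes
BLACKLIST = set()                                   # known-bad hashes

def is_suspicious(payload: bytes, source_trusted: bool, dest_critical: bool) -> bool:
    digest = hashlib.md5(payload).hexdigest()
    if digest in WHITELIST:
        return False
    if digest in BLACKLIST:
        return True
    # Rule-based fallback: untrusted source or sensitive destination.
    return (not source_trusted) or dest_critical

# The empty object is whitelisted in this toy example.
assert is_suspicious(b"", source_trusted=False, dest_critical=False) is False
```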
In step 1304, the virtual machine module 502 instantiates a first instance of a virtualization environment with the resources identified by the resource module 504. In one example, the virtual machine module 502 sets up plug-ins within the virtualization environment. In some embodiments, the resource module 504 inspects what resources the object needs (e.g., dynamic libraries and/or registries the object may affect). In step 1306, the object is executed and/or processed within the virtualization environment. In various embodiments, the taint module 508 taints operations of the object within the virtualization environment. In some embodiments, the taint module 508 taints the object, bit by bit, with trace capture information.
In step 1308, the virtual machine module 502 traces operations of the object during processing within the virtualization environment. In various embodiments, operations of, for, or provided as a response to the object may be used to generate one or more traces associated with the object in the virtualization environment. Similar to operations of the emulation environment trace capture module 1206, in some embodiments, the virtual machine module 502 may generate one or more traces based on actions of the plug-ins (e.g., the X trace) and actions taken that correlate between the virtualization environment and the plug-ins (e.g., the Y trace). In various embodiments, the actions identified for the trace generated by the virtual machine module 502 may also be filtered in a manner as described regarding filtering actions associated with the emulation environment.
In step 1310, the virtualization module 306 detects suspicious behavior of the object. For example, an object may be flagged as having suspicious behavior if the object executes a number of tasks and then abruptly terminates, executes one or more tasks that appear to have no relevant effect, checks a location in memory which should be empty, scans memory locations for no apparent purpose, hashes communication between the object and the host (e.g., to compare with a predetermined hash to identify a pattern of communication that is consistent with virtualization or security program interference), or the like.
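The heuristics above might be sketched as follows; the operation names, trace encoding, and thresholds are illustrative assumptions only:

```python
# Hypothetical sketch: heuristic checks over an ordered list of operation
# names observed for the object, mirroring the behaviors listed above.
def flag_suspicious(trace):
    """trace: ordered list of operation names observed for the object."""
    reasons = []
    if "exit" in trace and trace.index("exit") < 5:
        reasons.append("abrupt termination after only a few tasks")
    if trace.count("read_memory") > 100:
        reasons.append("broad memory scan with no apparent purpose")
    if "hash_communication" in trace:
        reasons.append("object hashing its own communication with the host")
    return reasons

trace = ["load_library", "exit"]
assert flag_suspicious(trace) == ["abrupt termination after only a few tasks"]
```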
In step 1312, the emulation module 308 instantiates an emulation environment. In various embodiments, the control module 310 determines when behavior of the object is suspicious and controls the instantiation of the emulation environment. One or more actions for or by the object within the virtualization environment may be used to determine a trustworthiness value. In one example, actions taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object. A trustworthiness value associated with one or more actions may be compared to a trustworthiness threshold to determine if the effect of the object within the virtualization environment is sufficiently trustworthy or if the object is behaving sufficiently suspiciously to warrant running the object through the emulation environment. A user (e.g., an administrator or security personnel) may set the threshold depending on the acceptable level of risk, available digital device resources (e.g., processors, speed of storage, available memory), and/or optimization.
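For illustration, a hedged sketch of such weighted scoring against a user-set threshold follows; the weight matrix, the default weight, and the threshold value are hypothetical:

```python
# Hypothetical sketch: weight each observed (dimension, action) pair and
# compare the sum against an administrator-set threshold.
ACTION_WEIGHTS = {                          # illustrative weights only
    ("os", "registry_write"): 3.0,
    ("application", "spawn_process"): 2.0,
    ("network", "beacon"): 4.0,
    ("object", "self_modify"): 5.0,
}

def suspicion_score(actions):
    # Unknown actions receive a small default weight.
    return sum(ACTION_WEIGHTS.get(a, 0.5) for a in actions)

THRESHOLD = 6.0   # set by an administrator based on acceptable risk

actions = [("os", "registry_write"), ("network", "beacon")]
if suspicion_score(actions) > THRESHOLD:    # 3.0 + 4.0 = 7.0 exceeds 6.0
    print("sufficiently suspicious: instantiate the emulation environment")
```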
In step 1312, the emulation module 308 instantiates the emulation environment. In various embodiments, the plug-in module 1204 instantiates the emulation environment with a standard set of plug-ins and/or other resources. In some embodiments, the plug-in module 1204 receives resource information from the resource module 504, the object, and/or metadata associated with the object. The plug-in module 1204 may configure the emulation environment (e.g., include one or more resources within the emulation environment) based, at least in part, on the resource information.
In step 1314, the emulation module 308 processes the object within the emulation environment. In step 1316, the trace capture module 1206 traces operations for or by the object during processing within the emulation environment. As discussed herein, the trace of the object within the emulation environment may be based on actions of the plug-ins and/or actions that correlate between the host, the plug-ins, and/or the object. Those skilled in the art will appreciate that the trace of the object within the emulation environment and the trace of the object within the virtualization environment may be performed in any way. In some embodiments, the trace is generated differently (e.g., based on different actions and/or different filtering) between the emulation environment and the virtualization environment.
In various embodiments, the recording module 1208 of the emulation module 308 may record information within the emulation environment. The recording module 1208 may record any action, resource, or operation of the emulation environment. In various embodiments, the recording module 1208 records responses provided from the plug-ins, the host, and/or the hardware to the object.
In step 1318, the manager module 1210 may compare one or more traces of the virtualization environment to one or more traces of the emulation environment to detect divergence. In one example, the object may scan memory within the virtualization environment and identify files and/or remnants of the running virtualization. As a result, the object may "go benign" and not perform any malicious behavior in the virtualization environment. A copy of the object within the emulation environment may scan the memory, find no files or remnants of a virtualization, and, as a result, execute a malicious payload. The manager module 1210 may identify the divergence between the two environments.
In various embodiments, when divergence is detected, the virtualization environment is re-instantiated or a new virtualization environment is instantiated at the point of divergence. For example, the time of divergence may be identified and the logged state of the emulation environment at that time of divergence may be provisioned within the virtualization environment. As a result, the virtualization module 306 may store similar data in memory at memory locations that are similar to the data stored in memory of the emulation environment at the time of divergence. Similarly, the clock signal within the virtualization environment may be accelerated such that the relevant conditions at the point of divergence may be similar between the two environments. The object within the virtualization environment may be presented with the recorded response from the emulation environment, and the operations of the object may continue to be monitored in the virtualization environment.
In step 1320, the virtualization module 306 re-instantiates the virtualization environment with recorded information from the emulation environment. In various embodiments, the virtualization module 306 re-instantiates the virtualization environment by halting processing within the virtualization environment and restarting the virtualization environment. In some embodiments, the virtualization module 306 re-instantiates the virtualization environment by instantiating a new virtualization environment.
The newly instantiated virtualization environment may be loaded with one or more states and information from the emulation environment. In one example, the emulation module 308 provides a page table (or information which may populate a page table) reflecting the state and/or operations within the emulation environment. The virtualization module 306 may instantiate the virtualization environment with all or some of the page table information.
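A minimal sketch of loading all or part of such page table information follows, with a flat dictionary standing in for a real page table; the names and page layout are illustrative assumptions:

```python
# Hypothetical sketch: transfer page-table-like state from the emulation
# environment into a newly instantiated virtualization environment.
def load_page_table(virt_memory, emu_page_table, pages=None):
    """Copy all page mappings, or only a selected subset, into the new environment."""
    for virtual_page, frame in emu_page_table.items():
        if pages is None or virtual_page in pages:
            virt_memory[virtual_page] = frame
    return virt_memory

emu_pages = {0x1000: "object_code", 0x2000: "recorded_response"}
virt = load_page_table({}, emu_pages)          # copy everything
assert virt[0x2000] == "recorded_response"
```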
The virtualization module 306 may also provide all or some of the recorded information from the emulation environment to the object in the newly instantiated virtualization environment. For example, the virtualization module 306 may construct the newly instantiated virtualization environment to include all resources, data, memory, clock signals, and activities up to the point of divergence such that it may appear to the object that the object has been processed from the beginning and the object has received information (e.g., the recorded information) needed to proceed to a next step. In some embodiments, the recorded information and state of the newly instantiated virtualization environment allows the object to continue functioning (e.g., the object executes a malicious payload or the object continues examination looking for security programs or virtualization before executing the payload).
In step 1322, the virtualization module 306 continues to monitor the operations by and for the object to identify suspicious behavior in the re-instantiated virtualization environment. Similar to step 1310, the virtualization module 306 may detect suspicious behavior in any number of ways. If the object continues to behave suspiciously, the control module 310 may instantiate a new emulation environment and may optionally load resources or plug-ins based on information from the resource module 504. Alternatively, the emulation module 308 may continue monitoring with the existing emulation environment. In some embodiments, the object continues processing in the emulation environment regardless of whether the object behaves suspiciously in the newly re-instantiated virtualization environment.
In step 1324, the virtualization module 306 monitors the operations by and for the object to identify untrusted actions in the re-instantiated virtualization environment. In various embodiments, one or more actions of or by the object may be characterized as untrusted based on a trustworthiness value. The trustworthiness value may be compared to a trustworthiness threshold to determine if actions taken by or for the object are untrustworthy (e.g., the object is considered to be malware). In one example, actions taken by or for an object may be weighted based on a matrix of actions regarding an operating system, application, network environment, or object. In some embodiments, a user (e.g., an administrator or security personnel) may set the threshold depending on the acceptable level of risk.
In step 1326, the reporting module 312 generates a report describing the object and untrusted actions as described herein. The report may be similar to the report generated regarding step 716 in FIG. 7.
FIG. 14 is an exemplary emulation environment 1400 for detection of malware in some embodiments. The emulation environment 1400 may comprise a first domain 1402 running the LINUX® operating system, as well as domains 1404 and 1406 running the WINDOWS® operating system. Further, the emulation environment 1400 may comprise a standard OS 1408, a taint analysis & data flow tracking function 1410, a direct memory access function 1412, OS forensics plug-ins 1414, a dynamic state modification & data flow captures function 1416, a divergence analysis module 1418, a hypervisor component 1420, a malware analysis virtual machine manager 1422, and a processor emulator 1424.
The domains 1402-1406 are the domains of the emulation environment 1400. In one example, the native domain may be domain 1402 running the LINUX® operating system, while domains 1404 and 1406 emulate the WINDOWS® operating system. Any domain may be native and any domain may be emulated.
The standard OS 1408 may be any operating system (e.g., the LINUX® operating system) which may pass information to the other components of the emulation environment 1400.
The taint analysis and data flow tracking function 1410 may monitor and perform taint analysis to determine indications of maliciousness. In some embodiments, the taint analysis and data flow tracking function 1410 may receive information regarding tracking functions and tainting from a plug-in (e.g., from the OS forensics plug-ins function 1414).
The dynamic state modification & data flow captures 1416 may determine what resources are needed by an object in the emulation environment 1400. The dynamic state modification & data flow captures 1416 may identify additional resources required by the object, increase or decrease time value within the emulation environment 1400, and/or monitor behavior of the object. For example, the object may request resources that were not originally provisioned by the hypervisor component 1420. The hypervisor component 1420 may provision a new emulation environment 1400, adjust the emulation environment 1400 with the new resources, and/or synchronize requested resources with one or more other virtualization environment(s) and/or emulation environment(s).
The divergence analysis module 1418 may track and compare operations of an object to determine divergence as discussed herein. In various embodiments, the divergence analysis module 1418 may trace operations as depicted in FIGS. 15 and 16 herein.
The hypervisor component 1420 may be configured to synchronize between multiple virtualization environment(s) and emulation environment(s). For example, the hypervisor component 1420 may control initial provisioning within the emulation environment based on resources requested in the virtualization environment, resources originally provisioned within the virtualization environment, resources requested in another emulation environment, or metadata associated with the object to be tested. The hypervisor component 1420 may also provide resources to be provisioned to one or more virtualization environment(s) and compare the operations between or among any number of environments (both virtualization environments and emulation environments).
The malware analysis virtual machine manager 1422 may receive information from the direct memory access 1412 and control the OS forensics plug-ins 1414. The OS forensics plug-ins 1414 may provide information to the dynamic state modification & data flow captures function 1416, the divergence analysis module 1418, and/or the hypervisor component 1420. In various embodiments, the malware analysis virtual machine manager 1422 may also select, initiate, control, and/or deactivate plug-ins.
The processor emulator 1424 is any processor emulator configured to assist in the emulation process. In one example, the processor emulator 1424 may implement a generic machine emulator and virtualizer.
FIG. 15 is a trace diagram 1500 of operations by or for an object in an emulation environment in some embodiments. In various embodiments, the trace capture module 1206 generates a trace of the behavior of the object in the emulation environment. The trace of the object in the emulation environment may be compared to a trace of the object in the virtualization environment to detect divergence. Those skilled in the art will appreciate that the trace of the object in the virtualization environment may be generated based on different factors or in a different way than the trace generated for the object in the emulation environment.
In step 1502, as processes for the object execute, actions taken for or by the object in the emulation environment may traverse execution trees. For example, the object may be a file that is executed within the INTERNET EXPLORER® browser. The object may spawn a mail process and an Active X process. In step 1504, nodes and branches generated on the execution tree may vary based on context.
In step 1506, the trace capture module 1206 captures execution paths of the execution trees. For example, the trace capture module 1206 may capture time and events (e.g., operations by or for the object) to determine an execution path. In some embodiments, the execution path may represent functions of the plug-ins of the emulation environment.
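For illustration, the following sketch enumerates the root-to-leaf execution paths of a spawned-process tree; the (node, children) encoding is an assumption, not the disclosed structure:

```python
# Hypothetical sketch: capture execution paths from a spawned-process tree.
def paths(tree, prefix=()):
    """Yield every root-to-leaf execution path in the tree."""
    node, children = tree
    current = prefix + (node,)
    if not children:
        yield current
    for child in children:
        yield from paths(child, current)

# The object, executed in a browser, spawns a mail process and an ActiveX
# process; the ActiveX process in turn spawns a payload.
tree = ("iexplore", [("mail", []), ("activex", [("payload", [])])])
assert list(paths(tree)) == [
    ("iexplore", "mail"),
    ("iexplore", "activex", "payload"),
]
```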
In step 1508, the control module 310 may correlate the execution paths of the execution trees between the emulation environment and the virtualization environment to detect divergence. Further, the execution paths of the execution trees of the virtualization environment and the emulation environment may be mapped against malicious behaviors and threat thresholds to determine the degree of untrustworthiness. For example, one or more operations of or by the object in the virtualization environment and/or the emulation environment may be measured against a predetermined threshold (e.g., based on a frame of reference identifying a degree of risk along different dimensions such as object operations, application operations, operating system operations, and network operations).
Those skilled in the art will appreciate that the threshold and trustworthiness valuation procedure may be customized. Certain risks, depending on the nature of the network, the state of critical information (e.g., encrypted), and the like may influence how an administrator may characterize the threshold and valuation procedure.
FIG. 16 is a block diagram 1600 of divergence detection between a virtualization environment and an emulation environment in some embodiments. Traces may be discrete based on events (e.g., operations) and/or time. Those skilled in the art will appreciate that in the various execution trees depicted in FIG. 16, the execution tree on the left side of the graphs represents an execution tree in the virtualization environment while the execution tree on the right of the graphs represents an execution tree in the emulation environment. Although the term “iteration” is used within FIG. 16, the different graphs 1602-1610 may be understood to be based on time, events, or paths.
In graph 1602, the instantiation of the execution tree at time 0 depicts an initial operation. The initial operation of the execution tree may be an instantiation of an application (e.g., INTERNET EXPLORER® browser v. 5.0) or resource call by or for the object.
The node depicted in graph 1602 is identified with a threat value to indicate whether the operation is trustworthy or untrustworthy. As discussed herein, one or more executions of the execution tree may be measured (e.g., characterized as a threat level) to determine a degree of threat or maliciousness (e.g., a trustworthiness value). The measure (i.e., the threat or trustworthiness value) may be compared against a threat or trustworthiness threshold to determine whether the action represents an untrusted action (e.g., malicious behavior). In some embodiments, all executions, paths, and nodes of the execution path may be measured to determine the degree of threat or maliciousness. Those skilled in the art will appreciate that each individual step (e.g., the deltas between the graphs) may be measured to determine a degree of threat of each step. Further, the entire execution tree at various points in time may also be measured to determine a degree of threat of the steps in combination.
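A hedged sketch of measuring individual steps and their combination against a threshold follows; the per-step threat values and the threshold are illustrative only:

```python
# Hypothetical sketch: score each step (the delta between consecutive
# graphs) and the accumulated path as a whole against a trust threshold.
STEP_THREAT = {"load:ActiveX": 1, "scan:memory": 2, "exec:payload": 5}

def combined_score(path):
    # Steps that are individually mild may be threatening in combination;
    # unknown steps receive a small default score.
    return sum(STEP_THREAT.get(op, 1) for op in path)

THRESHOLD = 4
path = ["load:ActiveX", "scan:memory", "scan:memory"]
assert combined_score(path) > THRESHOLD   # 1 + 2 + 2 = 5 exceeds the threshold
```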
In graph 1604, the first path (e.g., at time T+1 where time T begins at the 1st Iteration 1602) indicates that the execution path of the object within the virtualization environment and the execution path of the object within the emulation environment are similar. In one example, the object may load or make a call to Active X. In graph 1606, the second path (e.g., at time T+2) at the nth iteration indicates that the next execution path of the object within the virtualization environment and the execution path of the object within the emulation environment remain similar.
In graph 1608, the third path (e.g., at time T+3), indicates that there is a divergence of the execution path of the object in the virtualization environment when compared to the execution path of the object in the emulation environment. In one example, the object in the virtualization environment may have detected virtualization, a missing resource, or evidence of a security application and started behaving in a different manner. Once the divergence is detected, the control module 310 may re-instantiate the virtualization environment. In some embodiments, the control module 310 may load the state of the previous virtualization environment or the state of the emulation environment in the newly re-instantiated virtualization environment. In some embodiments, the state may include information of the virtualization environment and/or the emulation environment immediately before or at the time of divergence.
In some embodiments, the original virtualization environment and the emulation environment may continue without termination to further assess the execution path of the object in both environments. In one example, even though the path originally diverged, the object may continue to operate in a similar manner or to perform slightly different actions.
Those skilled in the art will appreciate that although graph 1610 is identified as “final,” there may be any number of paths over time before termination of the execution path (e.g., either by the object, the virtualization module 306, emulation module 308, or the control module 310).
Further, those skilled in the art will appreciate that the control module 310 may determine divergence of the execution trees between the object in the virtualization environment and the emulation environment as the steps in one or the other environments occur or once processing within one or both environments terminates.
The emulation module 308 may begin processing the object at any point in time. For example, the virtualization module 306 may track when suspicious behavior occurred. The emulation module 308 may be configured to provision the emulation environment with at least some of the resources of the virtualization module 306 including the states of the virtualization environment at the time of suspicious behavior. The emulation module 308 may then begin processing the object immediately before or at the time of suspicious behavior.
FIG. 17 is an exemplary process 1700 for a hierarchical reasoning engine (HRE) in some embodiments. In various embodiments, the HRE extracts significant instructions from a series of actions of an object under assessment. In one example, the HRE identifies sets of actions (e.g., resource requests and/or operations) that may be associated with malicious activities. The set of actions may be in any order or may, in part, depend upon the order of instructions.
The HRE may identify significant patterns based upon the set of instructions. For example, the HRE may compare sets or subsets of instructions against a table or other data structure that contains sets or subsets of instructions that indicate maliciousness. In some embodiments, the HRE may compare sets or subsets of instructions against a table or other data structure that contains sets or subsets of instructions that indicate trustworthiness.
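For illustration, a minimal sketch of matching an observed action set against known malicious sets follows; the pattern contents and the unordered subset-matching rule are assumptions, not the disclosed implementation:

```python
# Hypothetical sketch: compare the object's observed action set against
# known-malicious action sets; a pattern matches when it is fully contained
# in the observed set, in any order.
KNOWN_MALICIOUS_SETS = [
    frozenset({"open_address_book", "spawn_smtp", "mass_send"}),
    frozenset({"disable_av", "write_bootsector"}),
]

def hre_match(observed_actions):
    """Return the first known malicious pattern contained in the observed set."""
    observed = set(observed_actions)
    for pattern in KNOWN_MALICIOUS_SETS:
        if pattern <= observed:            # pattern fully contained
            return pattern
    return None

hit = hre_match(["spawn_smtp", "open_address_book", "mass_send", "sleep"])
assert hit is not None   # flag the object and raise its threat value
```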
Those skilled in the art will appreciate that the HRE may calculate a likelihood that a pattern indicative of trust or maliciousness will occur. In various embodiments, the HRE may take many different types of information (e.g., statistics, heuristics, metadata, and the like) into account to determine the likelihood. In various embodiments, the HRE may determine a value or bias a threat value based on the likelihood that the set of actions is malicious.
For example, the HRE may monitor operations of an object within an emulation environment. The HRE may track the actions of the object and compare a set of the object's actions to known malicious sets of actions. If the object's actions match a set of actions that are known to be malicious, then the HRE may flag the object as malicious and update a threat index to indicate an increased likelihood of maliciousness. In some embodiments, one or more emulation environment(s) and/or virtualization environment(s) (e.g., all or a subset of environments) may be terminated upon identification of a malicious set of actions. In various embodiments, the objects may continue to be assessed in one or more virtualization environment(s) and/or emulation environment(s) to gather more information.
The HRE may provide another level of information that may identify a likelihood of maliciousness and provide better information regarding the object's risk or trustworthiness. As a result, a user may set a preference for an acceptable level of risk and accept or reject data based on the level of trustworthiness calculated by the systems and methods described herein.
Those skilled in the art will appreciate that computational efficiency may be increased by tracking actions of objects and comparing those actions to known behaviors, thereby reducing the number of nodes that must be evaluated. Further, similarly behaving objects may be more readily observed and classified, thereby increasing overall efficiency and accuracy.
The above-described functions and components can be comprised of instructions that are stored on a storage medium such as a computer readable medium. The instructions can be retrieved and executed by a processor. Some examples of instructions are software, program code, and firmware. Some examples of storage medium are memory devices, tape, disks, integrated circuits, and servers. The instructions are operational when executed by the processor to direct the processor to operate in accord with embodiments of the present invention. Those skilled in the art are familiar with instructions, processor(s), and storage medium.
The present invention is described above with reference to exemplary embodiments. It will be apparent to those skilled in the art that various modifications may be made and other embodiments can be used without departing from the broader scope of the present invention. Therefore, these and other variations upon the exemplary embodiments are intended to be covered by the present invention.

Claims (21)

The invention claimed is:
1. A method comprising:
in response to a data collector on a first network determining that an object transmitted from a first digital device to a second digital device and intercepted by the data collector is suspicious, receiving at a second network the object from the data collector on the first network; and
at the second network:
instantiating a set of virtualization environments, each virtualization environment including a different set of one or more resources based on metadata received from the data collector;
processing the object within the set of virtualization environments;
tracing operations of the object while processing within each of the virtualization environments of the set of virtualization environments to generate a first set of traced operations;
detecting suspicious behavior associated with the object in at least one of the virtualization environments of the set of virtualization environments;
instantiating an emulation environment in response to the detected suspicious behavior in the at least one of the virtualization environments of the set of virtualization environments;
processing the object within the emulation environment;
recording responses to the object within the emulation environment to generate recorded responses;
tracing operations of the object while processing within the emulation environment to generate a second set of traced operations;
determining a likelihood of maliciousness based on at least one of the suspicious behavior associated with the object in the at least one of the virtualization environments, the recorded responses, the first set of traced operations, and the second set of traced operations; and
if the determined likelihood of maliciousness is greater than a threshold, generating a report regarding the object, the report including at least one of the recorded responses, the first set of traced operations, and the second set of traced operations.
2. The method of claim 1, wherein the suspicious behavior comprises the object loading data into memory within the at least one of the virtualization environments but not using the data within the at least one of the virtualization environments.
3. The method of claim 1, wherein the suspicious behavior comprises the object scanning locations in memory of the at least one of the virtualization environments and then terminating operations.
4. The method of claim 1, wherein the suspicious behavior comprises the object halting operations.
5. The method of claim 1, wherein trace capturing is performed in a kernel of a digital device hosting the emulation environment.
6. The method of claim 1, further comprising increasing or decreasing a frequency of a clock signal within the emulation environment.
7. The method of claim 1, further comprising re-instantiating the at least one of the virtualization environments in response to a detected divergence and instantiating a modified image of the at least one of the virtualization environments.
8. The method of claim 7, further comprising applying state information from the emulation environment to the at least one of the re-instantiated virtualization environments.
9. The method of claim 7, wherein re-instantiating the at least one of the virtualization environments in response to the detected divergence comprises halting the at least one of the virtualization environments and restarting the at least one of the virtualization environments.
10. The method of claim 7, wherein the at least one of the virtualization environments is re-instantiated at a state when the divergence is detected between the at least one of the virtualization environments and the emulation environment.
11. A system comprising:
memory;
one or more processors; and
one or more modules stored in the memory and configured for execution by the one or more processors, the modules comprising:
instructions to receive at a second network an object intercepted by a data collector on a first network in response to the data collector on the first network determining that an object transmitted from a first digital device to a second digital device is suspicious;
instructions to instantiate a set of virtualization environments, each virtualization environment including a different set of one or more resources based on metadata received from the data collector;
instructions to process the object within the set of virtualization environments;
instructions to trace operations of the object while processing within each of the virtualization environments of the set of virtualization environments to generate a first set of traced operations;
instructions to detect suspicious behavior associated with the object in at least one of the virtualization environments of the set of virtualization environments;
instructions to instantiate an emulation environment in response to the detected suspicious behavior in the at least one of the virtualization environments of the set of virtualization environments;
instructions to process the object within the emulation environment;
instructions to record responses to the object within the emulation environment to generate recorded responses;
instructions to trace operations of the object while processing within the emulation environment to generate a second set of traced operations;
instructions to determine a likelihood of maliciousness based on at least one of the suspicious behavior associated with the object in the at least one of the virtualization environments, the recorded responses, the first set of traced operations, and the second set of traced operations; and
instructions to generate a report regarding the object, if the determined likelihood of maliciousness is greater than a threshold, the report including at least one of the recorded responses, the first set of traced operations, and the second set of traced operations.
12. The system of claim 11, wherein the suspicious behavior comprises the object loading data into memory within the at least one of the virtualization environments but not using the data within the at least one of the virtualization environments.
13. The system of claim 11, wherein the suspicious behavior comprises the object scanning locations in memory of the at least one of the virtualization environments and then terminating operations.
14. The system of claim 11, wherein the suspicious behavior comprises the object halting operations.
15. The system of claim 11, wherein trace capturing is performed in a kernel of a digital device hosting the emulation environment.
16. The system of claim 11, wherein the modules further comprise instructions to increase or decrease a frequency of a clock signal within the emulation environment.
17. The system of claim 11, wherein the modules further comprise instructions to re-instantiate the at least one of the virtualization environments in response to a detected divergence and instructions to instantiate a modified image of the at least one of the virtualization environments in response to the detected divergence.
18. The system of claim 17, wherein the modules further comprise instructions to apply state information from the emulation environment to the at least one of the re-instantiated virtualization environments.
19. The system of claim 17, wherein the instructions to re-instantiate the at least one of the virtualization environments in response to the detected divergence comprise instructions to halt the at least one of the virtualization environments and restart the at least one of the virtualization environments.
20. The system of claim 17, wherein the modules further comprise instructions to re-instantiate the at least one of the virtualization environments at a state when the divergence is detected between the at least one of the virtualization environments and the emulation environment.
21. A non-transitory computer readable medium comprising instructions, the instructions being executable for performing a method, the method comprising:
in response to a data collector on a first network determining that an object transmitted from a first digital device to a second digital device and intercepted by the data collector is suspicious, receiving at a second network the object from the data collector on the first network;
instantiating a set of virtualization environments, each virtualization environment including a different set of one or more resources based on metadata received from the data collector;
processing the object within the set of virtualization environments;
tracing operations of the object while processing within each of the virtualization environments of the set of virtualization environments to generate a first set of traced operations;
detecting suspicious behavior associated with the object in at least one of the virtualization environments of the set of virtualization environments;
instantiating an emulation environment in response to the detected suspicious behavior in the at least one of the virtualization environments of the set of virtualization environments;
processing the object within the emulation environment;
recording responses to the object within the emulation environment to generate recorded responses;
tracing operations of the object while processing within the emulation environment to generate a second set of traced operations;
determining a likelihood of maliciousness based on at least one of the suspicious behavior associated with the object in the at least one of the virtualization environments, the recorded responses, the first set of traced operations, and the second set of traced operations; and
if the determined likelihood of maliciousness is greater than a threshold, generating a report regarding the object, the report including at least one of the recorded responses, the first set of traced operations, and the second set of traced operations.
US13/288,905 2011-11-03 2011-11-03 Systems and methods for virtualization and emulation assisted malware detection Active US9519781B2 (en)

Priority Applications (9)

Application Number Priority Date Filing Date Title
US13/288,905 US9519781B2 (en) 2011-11-03 2011-11-03 Systems and methods for virtualization and emulation assisted malware detection
EP12845692.8A EP2774039B1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualized malware detection
EP16167215.9A EP3093762B1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualization and emulation assisted malware detection
EP12844780.2A EP2774038B1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualization and emulation assisted malware detection
PCT/US2012/063566 WO2013067505A1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualization and emulation assisted malware detection
CA2854182A CA2854182A1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualization and emulation assisted malware detection
PCT/US2012/063569 WO2013067508A1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualized malware detection
CA2854183A CA2854183A1 (en) 2011-11-03 2012-11-05 Systems and methods for virtualized malware detection
US14/629,444 US9686293B2 (en) 2011-11-03 2015-02-23 Systems and methods for malware detection and mitigation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/288,917 US9792430B2 (en) 2011-11-03 2011-11-03 Systems and methods for virtualized malware detection
US13/288,905 US9519781B2 (en) 2011-11-03 2011-11-03 Systems and methods for virtualization and emulation assisted malware detection

Publications (2)

Publication Number Publication Date
US20130117848A1 US20130117848A1 (en) 2013-05-09
US9519781B2 true US9519781B2 (en) 2016-12-13

Family

ID=48192901

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/288,905 Active US9519781B2 (en) 2011-11-03 2011-11-03 Systems and methods for virtualization and emulation assisted malware detection

Country Status (4)

Country Link
US (1) US9519781B2 (en)
EP (3) EP2774038B1 (en)
CA (2) CA2854183A1 (en)
WO (2) WO2013067508A1 (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160162685A1 (en) * 2014-12-08 2016-06-09 Vmware, Inc. Monitoring application execution in a clone of a virtual computing instance for application whitelisting
US20160352771A1 (en) * 2014-01-27 2016-12-01 Cronus Cyber Technologies Ltd Automated penetration testing device, method and system
US9792430B2 (en) 2011-11-03 2017-10-17 Cyphort Inc. Systems and methods for virtualized malware detection
US10050998B1 (en) * 2015-12-30 2018-08-14 Fireeye, Inc. Malicious message analysis system
US10061924B1 (en) * 2015-12-31 2018-08-28 Symantec Corporation Detecting malicious code based on deviations in executable image import resolutions and load patterns
US10095866B2 (en) 2014-02-24 2018-10-09 Cyphort Inc. System and method for threat risk scoring of security threats
US10225280B2 (en) 2014-02-24 2019-03-05 Cyphort Inc. System and method for verifying and detecting malware
US10228981B2 (en) * 2017-05-02 2019-03-12 Intel Corporation High-performance input-output devices supporting scalable virtualization
US20190104136A1 (en) * 2012-09-28 2019-04-04 Level 3 Communications, Llc Apparatus, system and method for identifying and mitigating malicious network threats
US10268595B1 (en) 2017-10-24 2019-04-23 Red Hat, Inc. Emulating page modification logging for a nested hypervisor
US10326778B2 (en) 2014-02-24 2019-06-18 Cyphort Inc. System and method for detecting lateral movement and data exfiltration
US10335738B1 (en) * 2013-06-24 2019-07-02 Fireeye, Inc. System and method for detecting time-bomb malware
US10509729B2 (en) 2016-01-13 2019-12-17 Intel Corporation Address translation for scalable virtualization of input/output devices
US20200004963A1 (en) * 2018-06-29 2020-01-02 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US10581874B1 (en) * 2015-12-31 2020-03-03 Fireeye, Inc. Malware detection system with contextual analysis
US10785258B2 (en) 2017-12-01 2020-09-22 At&T Intellectual Property I, L.P. Counter intelligence bot
US10846404B1 (en) 2014-12-18 2020-11-24 Palo Alto Networks, Inc. Collecting algorithmically generated domains
US10853500B2 (en) 2018-08-06 2020-12-01 Xerox Corporation Method and system for identifying virtualized applications
US10867041B2 (en) 2013-07-30 2020-12-15 Palo Alto Networks, Inc. Static and dynamic security analysis of apps for mobile devices
US11010474B2 (en) 2018-06-29 2021-05-18 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US11184369B2 (en) * 2017-11-13 2021-11-23 Vectra Networks, Inc. Malicious relay and jump-system detection using behavioral indicators of actors
US11196765B2 (en) 2019-09-13 2021-12-07 Palo Alto Networks, Inc. Simulating user interactions for malware analysis
US11405410B2 (en) 2014-02-24 2022-08-02 Cyphort Inc. System and method for detecting lateral movement and data exfiltration
US11431735B2 (en) 2019-01-28 2022-08-30 Orca Security LTD. Techniques for securing virtual machines
US11528288B2 (en) 2018-11-30 2022-12-13 Ovh Service infrastructure and methods of predicting and detecting potential anomalies at the service infrastructure
RU2791824C1 (en) * 2022-02-14 2023-03-13 Групп-Ай Би Глобал Прайвет Лимитед Method and computing device for detecting target malicious web resource
US11669263B1 (en) 2021-03-24 2023-06-06 Meta Platforms, Inc. Systems and methods for low-overhead and flexible trace capture triggered on-demand by hardware events under production load
US11689560B2 (en) 2019-11-25 2023-06-27 Cisco Technology, Inc. Network-wide malware mapping

Families Citing this family (253)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8528086B1 (en) 2004-04-01 2013-09-03 Fireeye, Inc. System and method of detecting computer worms
US8898788B1 (en) 2004-04-01 2014-11-25 Fireeye, Inc. Systems and methods for malware attack prevention
US8566946B1 (en) 2006-04-20 2013-10-22 Fireeye, Inc. Malware containment on connection
US9106694B2 (en) 2004-04-01 2015-08-11 Fireeye, Inc. Electronic message analysis for malware detection
US8584239B2 (en) 2004-04-01 2013-11-12 Fireeye, Inc. Virtual machine with dynamic data flow analysis
US7587537B1 (en) 2007-11-30 2009-09-08 Altera Corporation Serializer-deserializer circuits formed from input-output circuit registers
US8793787B2 (en) 2004-04-01 2014-07-29 Fireeye, Inc. Detecting malicious network content using virtual environment components
US8171553B2 (en) 2004-04-01 2012-05-01 Fireeye, Inc. Heuristic based capture with replay to virtual machine
US8881282B1 (en) 2004-04-01 2014-11-04 Fireeye, Inc. Systems and methods for malware attack detection and identification
US8549638B2 (en) 2004-06-14 2013-10-01 Fireeye, Inc. System and method of containing computer worms
US8997219B2 (en) 2008-11-03 2015-03-31 Fireeye, Inc. Systems and methods for detecting malicious PDF network content
US8832829B2 (en) 2009-09-30 2014-09-09 Fireeye, Inc. Network-based binary file extraction and analysis for malware detection
US8555388B1 (en) 2011-05-24 2013-10-08 Palo Alto Networks, Inc. Heuristic botnet detection
US9686293B2 (en) 2011-11-03 2017-06-20 Cyphort Inc. Systems and methods for malware detection and mitigation
BR112014018826A8 (en) * 2012-01-30 2017-07-11 Intel Corp REMOTE RELIABILITY CERTIFICATION TECHNIQUES AND GEO-LOCATION OF SERVERS AND CLIENTS IN CLOUD COMPUTING ENVIRONMENTS
US9256742B2 (en) * 2012-01-30 2016-02-09 Intel Corporation Remote trust attestation and geo-location of servers and clients in cloud computing environments
US9355107B2 (en) * 2012-04-20 2016-05-31 3Rd Forensic Limited Crime investigation system
US8910138B2 (en) * 2012-05-23 2014-12-09 Oracle International Corporation Hot pluggable extensions for access management system
US20150163234A1 (en) * 2012-05-29 2015-06-11 Six Scan Ltd. System and methods for protecting computing devices from malware attacks
US10607007B2 (en) 2012-07-03 2020-03-31 Hewlett-Packard Development Company, L.P. Micro-virtual machine forensics and detection
US9092625B1 (en) * 2012-07-03 2015-07-28 Bromium, Inc. Micro-virtual machine forensics and detection
US20150161385A1 (en) * 2012-08-10 2015-06-11 Concurix Corporation Memory Management Parameters Derived from System Modeling
US9392003B2 (en) 2012-08-23 2016-07-12 Raytheon Foreground Security, Inc. Internet security cyber threat reporting system and method
US9258321B2 (en) 2012-08-23 2016-02-09 Raytheon Foreground Security, Inc. Automated internet threat detection and mitigation system and associated methods
US9053191B2 (en) * 2012-08-30 2015-06-09 Facebook, Inc. Retroactive search of objects using k-d tree
US9215239B1 (en) 2012-09-28 2015-12-15 Palo Alto Networks, Inc. Malware detection based on traffic analysis
US9104870B1 (en) 2012-09-28 2015-08-11 Palo Alto Networks, Inc. Detecting malware
US9922192B1 (en) 2012-12-07 2018-03-20 Bromium, Inc. Micro-virtual machine forensics and detection
US10572665B2 (en) 2012-12-28 2020-02-25 Fireeye, Inc. System and method to create a number of breakpoints in a virtual machine via virtual machine trapping events
US9195829B1 (en) 2013-02-23 2015-11-24 Fireeye, Inc. User interface with real-time visual playback along with synchronous textual analysis log display and event/time index for anomalous behavior detection in applications
US8990944B1 (en) 2013-02-23 2015-03-24 Fireeye, Inc. Systems and methods for automatically detecting backdoors
US9367681B1 (en) 2013-02-23 2016-06-14 Fireeye, Inc. Framework for efficient security coverage of mobile software applications using symbolic execution to reach regions of interest within an application
US9176843B1 (en) 2013-02-23 2015-11-03 Fireeye, Inc. Framework for efficient security coverage of mobile software applications
US9009823B1 (en) 2013-02-23 2015-04-14 Fireeye, Inc. Framework for efficient security coverage of mobile software applications installed on mobile devices
US9104867B1 (en) 2013-03-13 2015-08-11 Fireeye, Inc. Malicious content analysis using simulated user interaction without user involvement
US9626509B1 (en) 2013-03-13 2017-04-18 Fireeye, Inc. Malicious content analysis with multi-version application support within single operating environment
US9311479B1 (en) 2013-03-14 2016-04-12 Fireeye, Inc. Correlation and consolidation of analytic data for holistic view of a malware attack
US9430646B1 (en) 2013-03-14 2016-08-30 Fireeye, Inc. Distributed systems and methods for automatically detecting unknown bots and botnets
US9690676B2 (en) 2013-03-15 2017-06-27 Aerohive Networks, Inc. Assigning network device subnets to perform network activities using network device information
US9413781B2 (en) 2013-03-15 2016-08-09 Fireeye, Inc. System and method employing structured intelligence to verify and contain threats at endpoints
US10713358B2 (en) 2013-03-15 2020-07-14 Fireeye, Inc. System and method to extract and utilize disassembly features to classify software intent
US9948626B2 (en) 2013-03-15 2018-04-17 Aerohive Networks, Inc. Split authentication network systems and methods
US9495180B2 (en) 2013-05-10 2016-11-15 Fireeye, Inc. Optimized resource allocation for virtual machines within a malware content detection system
US9635039B1 (en) 2013-05-13 2017-04-25 Fireeye, Inc. Classifying sets of malicious indicators for detecting command and control communications associated with malware
WO2014185165A1 (en) * 2013-05-16 2014-11-20 日本電信電話株式会社 Information processing device, and information processing method
US10097567B2 (en) * 2013-05-20 2018-10-09 Nippon Telegraph And Telephone Corporation Information processing apparatus and identifying method
US10133863B2 (en) 2013-06-24 2018-11-20 Fireeye, Inc. Zero-day discovery system
US9300686B2 (en) 2013-06-28 2016-03-29 Fireeye, Inc. System and method for detecting malicious links in electronic messages
US10019575B1 (en) 2013-07-30 2018-07-10 Palo Alto Networks, Inc. Evaluating malware in a virtual machine using copy-on-write
US9613210B1 (en) 2013-07-30 2017-04-04 Palo Alto Networks, Inc. Evaluating malware in a virtual machine using dynamic patching
US10084817B2 (en) 2013-09-11 2018-09-25 NSS Labs, Inc. Malware and exploit campaign detection system and method
US20150089655A1 (en) * 2013-09-23 2015-03-26 Electronics And Telecommunications Research Institute System and method for detecting malware based on virtual host
US9294501B2 (en) 2013-09-30 2016-03-22 Fireeye, Inc. Fuzzy hash of behavioral results
US9628507B2 (en) 2013-09-30 2017-04-18 Fireeye, Inc. Advanced persistent threat (APT) detection center
US10192052B1 (en) * 2013-09-30 2019-01-29 Fireeye, Inc. System, apparatus and method for classifying a file as malicious using static scanning
US10515214B1 (en) 2013-09-30 2019-12-24 Fireeye, Inc. System and method for classifying malware within content created during analysis of a specimen
US9171160B2 (en) 2013-09-30 2015-10-27 Fireeye, Inc. Dynamically adaptive framework and method for classifying malware using intelligent static, emulation, and dynamic analyses
US9690936B1 (en) 2013-09-30 2017-06-27 Fireeye, Inc. Multistage system and method for analyzing obfuscated content for malware
US9736179B2 (en) 2013-09-30 2017-08-15 Fireeye, Inc. System, apparatus and method for using malware analysis results to drive adaptive instrumentation of virtual machines to improve exploit detection
US9921978B1 (en) 2013-11-08 2018-03-20 Fireeye, Inc. System and method for enhanced security of storage devices
US9258324B2 (en) 2013-11-26 2016-02-09 At&T Intellectual Property I, L.P. Methods, systems, and computer program products for protecting a communication network against internet enabled cyber attacks through use of screen replication from controlled internet access points
US9152782B2 (en) * 2013-12-13 2015-10-06 Aerohive Networks, Inc. Systems and methods for user-based network onboarding
US9747446B1 (en) 2013-12-26 2017-08-29 Fireeye, Inc. System and method for run-time object classification
US9756074B2 (en) 2013-12-26 2017-09-05 Fireeye, Inc. System and method for IPS and VM-based detection of suspicious objects
US9756069B1 (en) * 2014-01-10 2017-09-05 Trend Micro Inc. Instant raw scan on host PC with virtualization technology
US9507935B2 (en) 2014-01-16 2016-11-29 Fireeye, Inc. Exploit detection system with threat-aware microvisor
US10430614B2 (en) 2014-01-31 2019-10-01 Bromium, Inc. Automatic initiation of execution analysis
EP3102965B1 (en) * 2014-02-05 2023-07-26 Verve Group, Inc. Methods and apparatus for identification and ranking of synthetic locations for mobile applications
US9262635B2 (en) 2014-02-05 2016-02-16 Fireeye, Inc. Detection efficacy of virtual machine-based analysis with application specific events
WO2015134008A1 (en) * 2014-03-05 2015-09-11 Foreground Security Automated internet threat detection and mitigation system and associated methods
US9241010B1 (en) 2014-03-20 2016-01-19 Fireeye, Inc. System and method for network behavior detection
US10242185B1 (en) 2014-03-21 2019-03-26 Fireeye, Inc. Dynamic guest image creation and rollback
US9591015B1 (en) 2014-03-28 2017-03-07 Fireeye, Inc. System and method for offloading packet processing and static analysis operations
US9223972B1 (en) 2014-03-31 2015-12-29 Fireeye, Inc. Dynamically remote tuning of a malware content detection system
US9432389B1 (en) 2014-03-31 2016-08-30 Fireeye, Inc. System, apparatus and method for detecting a malicious attack based on static analysis of a multi-flow object
US9516054B2 (en) * 2014-04-14 2016-12-06 Trap Data Security Ltd. System and method for cyber threats detection
US9245123B1 (en) 2014-05-07 2016-01-26 Symantec Corporation Systems and methods for identifying malicious files
US9973531B1 (en) 2014-06-06 2018-05-15 Fireeye, Inc. Shellcode detection
US9438623B1 (en) 2014-06-06 2016-09-06 Fireeye, Inc. Computer exploit detection using heap spray pattern matching
US9594912B1 (en) 2014-06-06 2017-03-14 Fireeye, Inc. Return-oriented programming detection
US10084813B2 (en) 2014-06-24 2018-09-25 Fireeye, Inc. Intrusion prevention and remedy system
US9652615B1 (en) * 2014-06-25 2017-05-16 Symantec Corporation Systems and methods for analyzing suspected malware
US10805340B1 (en) 2014-06-26 2020-10-13 Fireeye, Inc. Infection vector and malware tracking with an interactive user display
US9398028B1 (en) 2014-06-26 2016-07-19 Fireeye, Inc. System, device and method for detecting a malicious attack based on communications between remotely hosted virtual machines and malicious web servers
US10002252B2 (en) 2014-07-01 2018-06-19 Fireeye, Inc. Verification of trusted threat-aware microvisor
US9489516B1 (en) 2014-07-14 2016-11-08 Palo Alto Networks, Inc. Detection of malware using an instrumented virtual machine environment
US9363280B1 (en) 2014-08-22 2016-06-07 Fireeye, Inc. System and method of detecting delivery of malware using cross-customer data
US10671726B1 (en) 2014-09-22 2020-06-02 Fireeye, Inc. System and method for malware analysis using thread-level event monitoring
US9773112B1 (en) 2014-09-29 2017-09-26 Fireeye, Inc. Exploit detection of malware and malware families
US10027689B1 (en) 2014-09-29 2018-07-17 Fireeye, Inc. Interactive infection visualization for improved exploit detection and signature generation for malware and malware families
US10044675B1 (en) 2014-09-30 2018-08-07 Palo Alto Networks, Inc. Integrating a honey network with a target network to counter IP and peer-checking evasion techniques
US9716727B1 (en) 2014-09-30 2017-07-25 Palo Alto Networks, Inc. Generating a honey network configuration to emulate a target network environment
US9860208B1 (en) 2014-09-30 2018-01-02 Palo Alto Networks, Inc. Bridging a virtual clone of a target device in a honey network to a suspicious device in an enterprise network
US9882929B1 (en) 2014-09-30 2018-01-30 Palo Alto Networks, Inc. Dynamic selection and generation of a virtual clone for detonation of suspicious content within a honey network
US9495188B1 (en) * 2014-09-30 2016-11-15 Palo Alto Networks, Inc. Synchronizing a honey network configuration to reflect a target network environment
US9781144B1 (en) * 2014-09-30 2017-10-03 Fireeye, Inc. Determining duplicate objects for malware analysis using environmental/context information
CN106796635B (en) * 2014-10-14 2019-10-22 Nippon Telegraph and Telephone Corporation Determination device and determination method
US9413774B1 (en) 2014-10-27 2016-08-09 Palo Alto Networks, Inc. Dynamic malware analysis of a URL using a browser executed in an instrumented virtual machine environment
CN105678164B (en) * 2014-11-20 2018-08-14 Huawei Technologies Co., Ltd. Method and device for detecting malware
US9542554B1 (en) 2014-12-18 2017-01-10 Palo Alto Networks, Inc. Deduplicating malware
US9690933B1 (en) 2014-12-22 2017-06-27 Fireeye, Inc. Framework for classifying an object as malicious with machine learning for deploying updated predictive models
US10075455B2 (en) 2014-12-26 2018-09-11 Fireeye, Inc. Zero-day rotating guest image profile
US9934376B1 (en) 2014-12-29 2018-04-03 Fireeye, Inc. Malware detection appliance architecture
US9838417B1 (en) 2014-12-30 2017-12-05 Fireeye, Inc. Intelligent context aware user interaction for malware detection
US9189630B1 (en) * 2015-01-21 2015-11-17 AO Kaspersky Lab Systems and methods for active operating system kernel protection
US9787695B2 (en) 2015-03-24 2017-10-10 Qualcomm Incorporated Methods and systems for identifying malware through differences in cloud vs. client behavior
US10148693B2 (en) 2015-03-25 2018-12-04 Fireeye, Inc. Exploit detection system
US9690606B1 (en) 2015-03-25 2017-06-27 Fireeye, Inc. Selective system call monitoring
US9438613B1 (en) 2015-03-30 2016-09-06 Fireeye, Inc. Dynamic content activation for automated analysis of embedded objects
US10474813B1 (en) 2015-03-31 2019-11-12 Fireeye, Inc. Code injection technique for remediation at an endpoint of a network
US10417031B2 (en) 2015-03-31 2019-09-17 Fireeye, Inc. Selective virtualization for security threat detection
US9483644B1 (en) 2015-03-31 2016-11-01 Fireeye, Inc. Methods for detecting file altering malware in VM based analysis
US9654485B1 (en) 2015-04-13 2017-05-16 Fireeye, Inc. Analytics-based security monitoring system and method
US9594904B1 (en) 2015-04-23 2017-03-14 Fireeye, Inc. Detecting malware based on reflection
US10642753B1 (en) 2015-06-30 2020-05-05 Fireeye, Inc. System and method for protecting a software component running in virtual machine using a virtualization layer
US10216927B1 (en) 2015-06-30 2019-02-26 Fireeye, Inc. System and method for protecting memory pages associated with a process using a virtualization layer
US10726127B1 (en) 2015-06-30 2020-07-28 Fireeye, Inc. System and method for protecting a software component running in a virtual machine through virtual interrupts by the virtualization layer
US10395029B1 (en) 2015-06-30 2019-08-27 Fireeye, Inc. Virtual system and method with threat protection
US11113086B1 (en) 2015-06-30 2021-09-07 Fireeye, Inc. Virtual system and method for securing external network connectivity
US10454950B1 (en) 2015-06-30 2019-10-22 Fireeye, Inc. Centralized aggregation technique for detecting lateral movement of stealthy cyber-attacks
US9606854B2 (en) 2015-08-13 2017-03-28 At&T Intellectual Property I, L.P. Insider attack resistant system and method for cloud services integrity checking
US10715542B1 (en) 2015-08-14 2020-07-14 Fireeye, Inc. Mobile application risk analysis
US10176321B2 (en) 2015-09-22 2019-01-08 Fireeye, Inc. Leveraging behavior-based rules for malware family classification
US10033759B1 (en) 2015-09-28 2018-07-24 Fireeye, Inc. System and method of threat detection under hypervisor control
US10033747B1 (en) 2015-09-29 2018-07-24 Fireeye, Inc. System and method for detecting interpreter-based exploit attacks
US10706149B1 (en) 2015-09-30 2020-07-07 Fireeye, Inc. Detecting delayed activation malware using a primary controller and plural time controllers
US10320828B1 (en) * 2015-09-30 2019-06-11 EMC IP Holding Company LLC Evaluation of security in a cyber simulator
US10817606B1 (en) 2015-09-30 2020-10-27 Fireeye, Inc. Detecting delayed activation malware using a run-time monitoring agent and time-dilation logic
US9825989B1 (en) 2015-09-30 2017-11-21 Fireeye, Inc. Cyber attack early warning system
US9825976B1 (en) 2015-09-30 2017-11-21 Fireeye, Inc. Detection and classification of exploit kits
US10601865B1 (en) 2015-09-30 2020-03-24 Fireeye, Inc. Detection of credential spearphishing attacks using email analysis
US10210329B1 (en) 2015-09-30 2019-02-19 Fireeye, Inc. Method to detect application execution hijacking using memory protection
US10320814B2 (en) 2015-10-02 2019-06-11 Trend Micro Incorporated Detection of advanced persistent threat attack on a private computer network
US9977896B2 (en) 2015-10-08 2018-05-22 Digital Guardian, Inc. Systems and methods for generating policies for an application using a virtualized environment
US10284575B2 (en) 2015-11-10 2019-05-07 Fireeye, Inc. Launcher for setting analysis environment variations for malware detection
US9984231B2 (en) * 2015-11-11 2018-05-29 Qualcomm Incorporated Detecting program evasion of virtual machines or emulators
EP3387517A4 (en) 2015-12-07 2019-05-15 Prismo Systems Inc. Systems and methods for detecting and responding to security threats using application execution and connection lineage tracing
US10447728B1 (en) 2015-12-10 2019-10-15 Fireeye, Inc. Technique for protecting guest processes using a layered virtualization architecture
US10846117B1 (en) 2015-12-10 2020-11-24 Fireeye, Inc. Technique for establishing secure communication between host and guest processes of a virtualization architecture
US10108446B1 (en) 2015-12-11 2018-10-23 Fireeye, Inc. Late load technique for deploying a virtualization layer underneath a running operating system
US10565378B1 (en) 2015-12-30 2020-02-18 Fireeye, Inc. Exploit of privilege detection framework
US10133866B1 (en) 2015-12-30 2018-11-20 Fireeye, Inc. System and method for triggering analysis of an object for malware in response to modification of that object
US10621338B1 (en) 2015-12-30 2020-04-14 Fireeye, Inc. Method to detect forgery and exploits using last branch recording registers
US9824216B1 (en) 2015-12-31 2017-11-21 Fireeye, Inc. Susceptible environment detection system
US11552986B1 (en) 2015-12-31 2023-01-10 Fireeye Security Holdings Us Llc Cyber-security framework for application of virtual features
US10601863B1 (en) 2016-03-25 2020-03-24 Fireeye, Inc. System and method for managing sensor enrollment
US10671721B1 (en) 2016-03-25 2020-06-02 Fireeye, Inc. Timeout management services
US10785255B1 (en) 2016-03-25 2020-09-22 Fireeye, Inc. Cluster configuration within a scalable malware detection system
US10476906B1 (en) 2016-03-25 2019-11-12 Fireeye, Inc. System and method for managing formation and modification of a cluster within a malware detection system
US10893059B1 (en) 2016-03-31 2021-01-12 Fireeye, Inc. Verification and enhancement using detection systems located at the network periphery and endpoint devices
US10476752B2 (en) * 2016-04-04 2019-11-12 Nec Corporation Blue print graphs for fusing of heterogeneous alerts
US10476749B2 (en) * 2016-04-04 2019-11-12 Nec Corporation Graph-based fusing of heterogeneous alerts
US10169585B1 (en) 2016-06-22 2019-01-01 Fireeye, Inc. System and methods for advanced malware detection through placement of transition events
US10462173B1 (en) 2016-06-30 2019-10-29 Fireeye, Inc. Malware detection verification and enhancement by coordinating endpoint and malware detection systems
US10482239B1 (en) 2016-06-30 2019-11-19 Palo Alto Networks, Inc. Rendering an object using multiple versions of an application in a single process for dynamic malware analysis
RU2649793C2 (en) 2016-08-03 2018-04-04 Group IB LLC Method and system for detecting a remote connection when working on web resource pages
WO2018039792A1 (en) * 2016-08-31 2018-03-08 Wedge Networks Inc. Apparatus and methods for network-based line-rate detection of unknown malware
US10592678B1 (en) 2016-09-09 2020-03-17 Fireeye, Inc. Secure communications between peers using a verified virtual trusted platform module
RU2634209C1 (en) 2016-09-19 2017-10-24 Group IB TDS LLC System and method for automatic generation of decision rules for intrusion detection systems with feedback
US10491627B1 (en) 2016-09-29 2019-11-26 Fireeye, Inc. Advanced malware detection using similarity analysis
CN107979581B (en) * 2016-10-25 2020-10-27 Huawei Technologies Co., Ltd. Detection method and device for zombie characteristics
US10795991B1 (en) 2016-11-08 2020-10-06 Fireeye, Inc. Enterprise search
US10587647B1 (en) 2016-11-22 2020-03-10 Fireeye, Inc. Technique for malware detection capability comparison of network security devices
US10552610B1 (en) 2016-12-22 2020-02-04 Fireeye, Inc. Adaptive virtual machine snapshot update framework for malware behavioral analysis
US10581879B1 (en) 2016-12-22 2020-03-03 Fireeye, Inc. Enhanced malware detection for generated objects
US10523609B1 (en) 2016-12-27 2019-12-31 Fireeye, Inc. Multi-vector malware detection and analysis
RU2637477C1 (en) 2016-12-29 2017-12-04 Trust LLC System and method for detecting phishing web pages
RU2671991C2 (en) 2016-12-29 2018-11-08 Trust LLC System and method for collecting information for detecting phishing
US10185645B2 (en) * 2017-03-08 2019-01-22 Microsoft Technology Licensing, Llc Resource lifetime analysis using a time-travel trace
US9934127B1 (en) 2017-03-08 2018-04-03 Microsoft Technology Licensing, Llc Indexing a trace by insertion of key frames for replay responsiveness
US10904286B1 (en) 2017-03-24 2021-01-26 Fireeye, Inc. Detection of phishing attacks using similarity analysis
US10554507B1 (en) 2017-03-30 2020-02-04 Fireeye, Inc. Multi-level control for enhanced resource and object evaluation management of malware detection system
US10902119B1 (en) 2017-03-30 2021-01-26 Fireeye, Inc. Data extraction system for malware analysis
US10798112B2 (en) 2017-03-30 2020-10-06 Fireeye, Inc. Attribute-controlled malware detection
US10791138B1 (en) 2017-03-30 2020-09-29 Fireeye, Inc. Subscription-based malware detection
US10855700B1 (en) 2017-06-29 2020-12-01 Fireeye, Inc. Post-intrusion detection of cyber-attacks during lateral movement within networks
US10601848B1 (en) 2017-06-29 2020-03-24 Fireeye, Inc. Cyber-security system and method for weak indicator detection and correlation to generate strong indicators
US10503904B1 (en) 2017-06-29 2019-12-10 Fireeye, Inc. Ransomware detection and mitigation
US10893068B1 (en) 2017-06-30 2021-01-12 Fireeye, Inc. Ransomware file modification prevention technique
US10747872B1 (en) 2017-09-27 2020-08-18 Fireeye, Inc. System and method for preventing malware evasion
US10805346B2 (en) 2017-10-01 2020-10-13 Fireeye, Inc. Phishing attack detection
US11108809B2 (en) 2017-10-27 2021-08-31 Fireeye, Inc. System and method for analyzing binary code for malware classification using artificial neural network techniques
RU2689816C2 (en) 2017-11-21 2019-05-29 Group IB LLC Method for classifying a sequence of user actions (embodiments)
US11271955B2 (en) 2017-12-28 2022-03-08 Fireeye Security Holdings Us Llc Platform and method for retroactive reclassification employing a cybersecurity-based global data store
US11005860B1 (en) 2017-12-28 2021-05-11 Fireeye, Inc. Method and system for efficient cybersecurity analysis of endpoint events
US11240275B1 (en) 2017-12-28 2022-02-01 Fireeye Security Holdings Us Llc Platform and method for performing cybersecurity analyses employing an intelligence hub with a modular architecture
RU2668710C1 (en) 2018-01-17 2018-10-02 Group IB TDS LLC Computing device and method for detecting malicious domain names in network traffic
RU2676247C1 (en) 2018-01-17 2018-12-26 Group IB LLC Method and computing device for clustering web resources
RU2680736C1 (en) 2018-01-17 2019-02-26 Group IB TDS LLC Server and method for detecting malware files in network traffic
RU2677361C1 (en) 2018-01-17 2019-01-16 Trust LLC Method and system for decentralized identification of malware programs
RU2677368C1 (en) 2018-01-17 2019-01-16 Group IB LLC Method and system for automatic determination of fuzzy duplicates of video content
RU2756186C2 (en) * 2018-02-06 2021-09-28 AO Kaspersky Lab System and method for categorization of .NET applications
RU2681699C1 (en) 2018-02-13 2019-03-12 Trust LLC Method and server for searching related network resources
US10671725B2 (en) 2018-03-20 2020-06-02 Didi Research America, Llc Malicious process tracking
CA3094338A1 (en) * 2018-03-26 2019-10-03 Virsec Systems, Inc. Trusted execution security policy platform
US10826931B1 (en) 2018-03-29 2020-11-03 Fireeye, Inc. System and method for predicting and mitigating cybersecurity system misconfigurations
US11558401B1 (en) 2018-03-30 2023-01-17 Fireeye Security Holdings Us Llc Multi-vector malware detection data sharing system for improved detection
US11003773B1 (en) 2018-03-30 2021-05-11 Fireeye, Inc. System and method for automatically generating malware detection rule recommendations
US10956477B1 (en) 2018-03-30 2021-03-23 Fireeye, Inc. System and method for detecting malicious scripts through natural language processing modeling
US10698426B2 (en) 2018-05-07 2020-06-30 Mks Instruments, Inc. Methods and apparatus for multiple channel mass flow and ratio control systems
US11075930B1 (en) 2018-06-27 2021-07-27 Fireeye, Inc. System and method for detecting repetitive cybersecurity attacks constituting an email campaign
US11314859B1 (en) 2018-06-27 2022-04-26 FireEye Security Holdings, Inc. Cyber-security system and method for detecting escalation of privileges within an access token
US11228491B1 (en) 2018-06-28 2022-01-18 Fireeye Security Holdings Us Llc System and method for distributed cluster configuration monitoring and management
US11316900B1 (en) 2018-06-29 2022-04-26 FireEye Security Holdings Inc. System and method for automatically prioritizing rules for cyber-threat detection and mitigation
US10757093B1 (en) * 2018-08-31 2020-08-25 Splunk Inc. Identification of runtime credential requirements
US10853478B1 (en) 2018-08-31 2020-12-01 Splunk Inc. Encrypted storage and provision of authentication information for use when responding to an information technology incident
US10938838B2 (en) * 2018-08-31 2021-03-02 Sophos Limited Computer augmented threat evaluation
US11182473B1 (en) 2018-09-13 2021-11-23 Fireeye Security Holdings Us Llc System and method for mitigating cyberattacks against processor operability by a guest process
US11763004B1 (en) 2018-09-27 2023-09-19 Fireeye Security Holdings Us Llc System and method for bootkit detection
RU2708508C1 (en) 2018-12-17 2019-12-09 Trust LLC Method and computing device for detecting suspicious users in messaging systems
US11176251B1 (en) 2018-12-21 2021-11-16 Fireeye, Inc. Determining malware via symbolic function hash analysis
US11368475B1 (en) 2018-12-21 2022-06-21 Fireeye Security Holdings Us Llc System and method for scanning remote services to locate stored objects with malware
US11743290B2 (en) 2018-12-21 2023-08-29 Fireeye Security Holdings Us Llc System and method for detecting cyberattacks impersonating legitimate sources
CN111368295A (en) * 2018-12-26 2020-07-03 ZTE Corporation Malicious sample detection method, device and system, and storage medium
RU2701040C1 (en) 2018-12-28 2019-09-24 Trust LLC Method and computer for informing about malicious web resources
US11601444B1 (en) 2018-12-31 2023-03-07 Fireeye Security Holdings Us Llc Automated system for triage of customer issues
EP3842968A4 (en) 2019-02-27 2022-03-16 "Group IB" Ltd. Method and system for identifying a user according to keystroke dynamics
US11310238B1 (en) 2019-03-26 2022-04-19 FireEye Security Holdings, Inc. System and method for retrieval and analysis of operational data from customer, cloud-hosted virtual resources
US11409867B2 (en) 2019-03-28 2022-08-09 Juniper Networks, Inc. Behavioral detection of malicious scripts
US11677786B1 (en) 2019-03-29 2023-06-13 Fireeye Security Holdings Us Llc System and method for detecting and protecting against cybersecurity attacks on servers
US11636198B1 (en) 2019-03-30 2023-04-25 Fireeye Security Holdings Us Llc System and method for cybersecurity analyzer update and concurrent management system
US11736499B2 (en) 2019-04-09 2023-08-22 Corner Venture Partners, Llc Systems and methods for detecting injection exploits
US11258806B1 (en) 2019-06-24 2022-02-22 Mandiant, Inc. System and method for automatically associating cybersecurity intelligence to cyberthreat actors
US11556640B1 (en) 2019-06-27 2023-01-17 Mandiant, Inc. Systems and methods for automated cybersecurity analysis of extracted binary string sets
US11392700B1 (en) 2019-06-28 2022-07-19 Fireeye Security Holdings Us Llc System and method for supporting cross-platform data verification
CN111030981B (en) * 2019-08-13 2023-04-28 Beijing Antiy Network Security Technology Co., Ltd. Method, system and storage device for blocking continuous attacks by malicious files
US11736498B1 (en) 2019-08-29 2023-08-22 Trend Micro Incorporated Stateful detection of cyberattacks
US11886585B1 (en) 2019-09-27 2024-01-30 Musarubra Us Llc System and method for identifying and mitigating cyberattacks through malicious position-independent code execution
US11637862B1 (en) 2019-09-30 2023-04-25 Mandiant, Inc. System and method for surfacing cyber-security threats with a self-learning recommendation engine
RU2728497C1 (en) 2019-12-05 2020-07-29 Group IB TDS LLC Method and system for determining the affiliation of software by its machine code
RU2728498C1 (en) 2019-12-05 2020-07-29 Group IB TDS LLC Method and system for determining the affiliation of software by its source code
US11271907B2 (en) 2019-12-19 2022-03-08 Palo Alto Networks, Inc. Smart proxy for a large scale high-interaction honeypot farm
US11265346B2 (en) 2019-12-19 2022-03-01 Palo Alto Networks, Inc. Large scale high-interactive honeypot farm
RU2743974C1 (en) 2019-12-19 2021-03-01 Group IB TDS LLC System and method for security scanning of network architecture elements
US11436327B1 (en) 2019-12-24 2022-09-06 Fireeye Security Holdings Us Llc System and method for circumventing evasive code for cyberthreat detection
US11838300B1 (en) 2019-12-24 2023-12-05 Musarubra Us Llc Run-time configurable cybersecurity system
US11522884B1 (en) 2019-12-24 2022-12-06 Fireeye Security Holdings Us Llc Subscription and key management system
SG10202001963TA (en) 2020-03-04 2021-10-28 Group-IB Global Private Ltd. System and method for brand protection based on search results
US11681804B2 (en) 2020-03-09 2023-06-20 Commvault Systems, Inc. System and method for automatic generation of malware detection traps
US11263109B2 (en) * 2020-04-16 2022-03-01 Bank Of America Corporation Virtual environment system for validating executable data using accelerated time-based process execution
US11425123B2 (en) 2020-04-16 2022-08-23 Bank Of America Corporation System for network isolation of affected computing systems using environment hash outputs
US11528276B2 (en) 2020-04-16 2022-12-13 Bank Of America Corporation System for prevention of unauthorized access using authorized environment hash outputs
US11423160B2 (en) 2020-04-16 2022-08-23 Bank Of America Corporation System for analysis and authorization for use of executable environment data in a computing system using hash outputs
US11481484B2 (en) 2020-04-16 2022-10-25 Bank Of America Corporation Virtual environment system for secure execution of program code using cryptographic hashes
US11372982B2 (en) 2020-07-02 2022-06-28 Bank Of America Corporation Centralized network environment for processing validated executable data based on authorized hash outputs
US11475090B2 (en) 2020-07-15 2022-10-18 Group-Ib Global Private Limited Method and system for identifying clusters of affiliated web resources
RU2744438C1 (en) * 2020-08-03 2021-03-09 Echelon North-West JSC System and method for forming an optimal set of tests for identifying software bugs
RU2743619C1 (en) 2020-08-06 2021-02-20 Group IB TDS LLC Method and system for generating a list of indicators of compromise
JP7373803B2 (en) * 2020-09-29 2023-11-06 Panasonic IP Management Co., Ltd. Information transmitting device, server, and information transmitting method
US20220269785A1 (en) * 2021-02-23 2022-08-25 Saudi Arabian Oil Company Enhanced cybersecurity analysis for malicious files detected at the endpoint level
WO2022191843A1 (en) * 2021-03-11 2022-09-15 Hewlett-Packard Development Company, L.P. Instructions to process files in virtual machines
US11947572B2 (en) 2021-03-29 2024-04-02 Group IB TDS, Ltd Method and system for clustering executable files
US20230297687A1 (en) * 2022-03-21 2023-09-21 Vmware, Inc. Opportunistic hardening of files to remediate security threats posed by malicious applications

Patent Citations (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7418729B2 (en) * 2002-07-19 2008-08-26 Symantec Corporation Heuristic detection of malicious computer code by page tracking
US20050081053A1 (en) 2003-10-10 2005-04-14 International Business Machines Corporation Systems and methods for efficient computer virus detection
US20100192223A1 (en) 2004-04-01 2010-07-29 Osman Abdoul Ismael Detecting Malicious Network Content Using Virtual Environment Components
US20070250930A1 (en) 2004-04-01 2007-10-25 Ashar Aziz Virtual machine with dynamic data flow analysis
US8204984B1 (en) * 2004-04-01 2012-06-19 Fireeye, Inc. Systems and methods for detecting encrypted bot command and control communication channels
US20060010440A1 (en) 2004-07-07 2006-01-12 Anderson Andrew V Optimizing system behavior in a virtual machine environment
US20060161982A1 (en) * 2005-01-18 2006-07-20 Chari Suresh N Intrusion detection system
US7664626B1 (en) * 2006-03-24 2010-02-16 Symantec Corporation Ambiguous-state support in virtual machine emulators
US20070244987A1 (en) 2006-04-12 2007-10-18 Pedersen Bradley J Systems and Methods for Accelerating Delivery of a Computing Environment to a Remote User
US8375444B2 (en) 2006-04-20 2013-02-12 Fireeye, Inc. Dynamic signature creation and enforcement
US8407797B1 (en) * 2006-07-14 2013-03-26 Bitdefender IPR Management Ltd. Anti-malware emulation systems and methods
US8151352B1 (en) * 2006-07-14 2012-04-03 Bitdefender IPR Management Ltd. Anti-malware emulation systems and methods
US20080086776A1 (en) 2006-10-06 2008-04-10 George Tuvell System and method of malware sample collection on mobile networks
US8060074B2 (en) * 2007-07-30 2011-11-15 Mobile Iron, Inc. Virtual instance architecture for mobile device management systems
US8176477B2 (en) * 2007-09-14 2012-05-08 International Business Machines Corporation Method, system and program product for optimizing emulation of a suspected malware
US20090077544A1 (en) 2007-09-14 2009-03-19 International Business Machines Corporation Method, system and program product for optimizing emulation of a suspected malware
US8108912B2 (en) * 2008-05-29 2012-01-31 Red Hat, Inc. Systems and methods for management of secure data in cloud-based network
US20100064299A1 (en) 2008-09-09 2010-03-11 Kace Networks, Inc. Deployment and Management of Virtual Containers
US7540030B1 (en) * 2008-09-15 2009-05-26 Kaspersky Lab, Zao Method and system for automatic cure against malware
US20130263260A1 (en) 2008-10-21 2013-10-03 Lookout, Inc. System and method for assessing an application to be installed on a mobile communication device
US20130318568A1 (en) 2008-10-21 2013-11-28 Lookout Inc. Assessing a data object based on application data associated with the data object
US20100115621A1 (en) 2008-11-03 2010-05-06 Stuart Gresley Staniford Systems and Methods for Detecting Malicious Network Content
US20110302656A1 (en) 2009-02-24 2011-12-08 Fadi El-Moussa Detecting malicious behaviour on a computer network
US8266698B1 (en) 2009-03-09 2012-09-11 Symantec Corporation Using machine infection characteristics for behavior-based detection of malware
US8516589B2 (en) * 2009-04-07 2013-08-20 Samsung Electronics Co., Ltd. Apparatus and method for preventing virus code execution
US8769683B1 (en) 2009-07-07 2014-07-01 Trend Micro Incorporated Apparatus and methods for remote classification of unknown malware
US20110041179A1 (en) 2009-08-11 2011-02-17 F-Secure Oyj Malware detection
US20110054879A1 (en) * 2009-08-27 2011-03-03 Ibm Corporation Accelerated Execution for Emulated Environments
US20110055123A1 (en) 2009-08-31 2011-03-03 Symantec Corporation Systems and Methods for Using Multiple In-line Heuristics to Reduce False Positives
US8375450B1 (en) 2009-10-05 2013-02-12 Trend Micro, Inc. Zero day malware scanner
US20110145926A1 (en) 2009-12-15 2011-06-16 Mcafee, Inc. Systems and methods for behavioral sandboxing
US20110167494A1 (en) 2009-12-31 2011-07-07 Bowen Brian M Methods, systems, and media for detecting covert malware
US20110225655A1 (en) * 2010-03-15 2011-09-15 F-Secure Oyj Malware protection
US20110271343A1 (en) 2010-04-28 2011-11-03 Electronics And Telecommunications Research Institute Apparatus, system and method for detecting malicious code
US20120110672A1 (en) 2010-05-14 2012-05-03 Mcafee, Inc. Systems and methods for classification of messaging entities
US8751490B1 (en) 2011-03-31 2014-06-10 Symantec Corporation Automatically determining reputations of physical locations
US8984581B2 (en) * 2011-07-27 2015-03-17 Seven Networks, Inc. Monitoring mobile application activities for malicious traffic on a mobile device
US20130097706A1 (en) 2011-09-16 2013-04-18 Veracode, Inc. Automated behavioral and static analysis using an instrumented sandbox and machine learning classification for mobile security
US20130117849A1 (en) 2011-11-03 2013-05-09 Ali Golshan Systems and Methods for Virtualized Malware Detection
US20150244732A1 (en) 2011-11-03 2015-08-27 Cyphort Inc. Systems And Methods For Malware Detection And Mitigation
US20130227691A1 (en) 2012-02-24 2013-08-29 Ashar Aziz Detecting Malicious Network Content
US20130276114A1 (en) 2012-02-29 2013-10-17 Sourcefire, Inc. Method and apparatus for retroactively detecting malicious or otherwise undesirable software
US20130298244A1 (en) 2012-05-01 2013-11-07 Taasera, Inc. Systems and methods for threat identification and remediation
US20150135262A1 (en) 2012-05-03 2015-05-14 Shine Security Ltd. Detection and prevention for malicious threats
US20140090061A1 (en) 2012-09-26 2014-03-27 Northrop Grumman Systems Corporation System and method for automated machine-learning, zero-day malware detection
US20140096251A1 (en) 2012-09-28 2014-04-03 Level 3 Communications, Llc Apparatus, system and method for identifying and mitigating malicious network threats
US20150007312A1 (en) 2013-06-28 2015-01-01 Vinay Pidathala System and method for detecting malicious links in electronic messages
US20150106927A1 (en) 2013-10-14 2015-04-16 Ut-Battelle, Llc Real-time detection and classification of anomalous events in streaming data
US20150180883A1 (en) 2013-10-22 2015-06-25 Erdem Aktas Control flow graph representation and classification
US20150128263A1 (en) 2013-11-07 2015-05-07 Cyberpoint International, LLC Methods and systems for malware detection
US20150172300A1 (en) 2013-12-17 2015-06-18 Hoplite Industries, Inc. Behavioral model based malware protection system and method
US20150244730A1 (en) 2014-02-24 2015-08-27 Cyphort Inc. System And Method For Verifying And Detecting Malware
US20160065601A1 (en) 2014-02-24 2016-03-03 Cyphort Inc. System And Method For Detecting Lateral Movement And Data Exfiltration
US20160078229A1 (en) 2014-02-24 2016-03-17 Cyphort Inc. System And Method For Threat Risk Scoring Of Security Threats

Non-Patent Citations (25)

* Cited by examiner, † Cited by third party
Title
Bayer, Ulrich, "TTAnalyze: A Tool for Analyzing Malware", Master's Thesis, Technical University of Vienna, Dec. 12, 2005.
Cavallaro, Lorenzo et al., "Anti-Taint-Analysis: Practical Evasion Techniques Against Information Flow Based Malware Defense", Secure Systems Lab at Stony Brook University, Tech. Rep, pp. 1-18, Nov. 2007.
Chen, Xu et al. "Towards an Understanding of Anti-Virtualization and Anti-Debugging Behavior in Modern Malware," In Proceedings of the 38th Annual IEEE International Conference on Dependable Systems and Networks (DSN), Jun. 24, 2008.
Extended European Search Report in European Application No. 12844780.2, dated Jul. 22, 2015.
Extended European Search Report in European Application No. 12845692.8, dated Oct. 8, 2015.
Extended European Search Report in European Application No. 16167215.9, dated Sep. 28, 2016.
Ho, Alex et al., "Practical Taint-Based Protection Using Demand Emulation", EuroSys '06, Proceedings of the 1st ACM SIGOPS/EuroSys European Conference on Computer Systems 2006, vol. 40, Issue 4, pp. 29-41, Oct. 2006.
International Application No. PCT/US2012/063566, International Search Report and Written Opinion mailed Jan. 22, 2013.
International Application No. PCT/US2012/063569, International Search Report and Written Opinion mailed Jan. 18, 2013.
International Preliminary Report on Patentability in International Application No. PCT/US2012/063566, mailed May 15, 2014.
International Preliminary Report on Patentability in International Application No. PCT/US2012/063569, mailed May 15, 2014.
International Preliminary Report on Patentability in International Application No. PCT/US2015/017389, mailed Sep. 9, 2016.
International Preliminary Report on Patentability in International Application No. PCT/US2015/017394, mailed Sep. 9, 2016.
International Search Report and Written Opinion in International Application No. PCT/US2015/017389, mailed Dec. 11, 2015.
International Search Report and Written Opinion in International Application No. PCT/US2015/017394, mailed Jun. 10, 2015.
Kang, Min Gyung et al. "Emulating Emulation-Resistant Malware," In Proceedings of the Workshop on Virtual Machine Security (VMSec), Nov. 9, 2009.
Office Action in U.S. Appl. No. 13/288,917, mailed Dec. 3, 2014.
Office Action in U.S. Appl. No. 13/288,917, mailed Jul. 13, 2015.
Office Action in U.S. Appl. No. 13/288,917, mailed Jun. 1, 2016.
Office Action in U.S. Appl. No. 13/288,917, mailed Jun. 30, 2014.
Office Action in U.S. Appl. No. 13/288,917, mailed Sep. 12, 2013.
Office Action in U.S. Appl. No. 14/629,435, mailed Apr. 1, 2016.
Office Action in U.S. Appl. No. 14/629,444, mailed Feb. 9, 2016.
Office Action in U.S. Appl. No. 14/629,444, mailed Jun. 9, 2016.
Zhang, Xiao-Song et al., "A Practical Taint-Based Malware Detection", ICACIA 2008, International Conference on Apperceiving Computing and Intelligence Analysis, pp. 73-77, Dec. 2008.

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9792430B2 (en) 2011-11-03 2017-10-17 Cyphort Inc. Systems and methods for virtualized malware detection
US10721243B2 (en) * 2012-09-28 2020-07-21 Level 3 Communications, Llc Apparatus, system and method for identifying and mitigating malicious network threats
US20190104136A1 (en) * 2012-09-28 2019-04-04 Level 3 Communications, Llc Apparatus, system and method for identifying and mitigating malicious network threats
US10335738B1 (en) * 2013-06-24 2019-07-02 Fireeye, Inc. System and method for detecting time-bomb malware
US10867041B2 (en) 2013-07-30 2020-12-15 Palo Alto Networks, Inc. Static and dynamic security analysis of apps for mobile devices
US10237296B2 (en) * 2014-01-27 2019-03-19 Cronus Cyber Technologies Ltd Automated penetration testing device, method and system
US20160352771A1 (en) * 2014-01-27 2016-12-01 Cronus Cyber Technologies Ltd Automated penetration testing device, method and system
US10225280B2 (en) 2014-02-24 2019-03-05 Cyphort Inc. System and method for verifying and detecting malware
US11405410B2 (en) 2014-02-24 2022-08-02 Cyphort Inc. System and method for detecting lateral movement and data exfiltration
US11902303B2 (en) 2014-02-24 2024-02-13 Juniper Networks, Inc. System and method for detecting lateral movement and data exfiltration
US10095866B2 (en) 2014-02-24 2018-10-09 Cyphort Inc. System and method for threat risk scoring of security threats
US10326778B2 (en) 2014-02-24 2019-06-18 Cyphort Inc. System and method for detecting lateral movement and data exfiltration
US10726119B2 (en) * 2014-12-08 2020-07-28 Vmware, Inc. Monitoring application execution in a clone of a virtual computing instance for application whitelisting
US20160162685A1 (en) * 2014-12-08 2016-06-09 Vmware, Inc. Monitoring application execution in a clone of a virtual computing instance for application whitelisting
US11036859B2 (en) 2014-12-18 2021-06-15 Palo Alto Networks, Inc. Collecting algorithmically generated domains
US10846404B1 (en) 2014-12-18 2020-11-24 Palo Alto Networks, Inc. Collecting algorithmically generated domains
US10581898B1 (en) * 2015-12-30 2020-03-03 Fireeye, Inc. Malicious message analysis system
US10050998B1 (en) * 2015-12-30 2018-08-14 Fireeye, Inc. Malicious message analysis system
US10581874B1 (en) * 2015-12-31 2020-03-03 Fireeye, Inc. Malware detection system with contextual analysis
US10061924B1 (en) * 2015-12-31 2018-08-28 Symantec Corporation Detecting malicious code based on deviations in executable image import resolutions and load patterns
US10509729B2 (en) 2016-01-13 2019-12-17 Intel Corporation Address translation for scalable virtualization of input/output devices
US10228981B2 (en) * 2017-05-02 2019-03-12 Intel Corporation High-performance input-output devices supporting scalable virtualization
US11656916B2 (en) 2017-05-02 2023-05-23 Intel Corporation High-performance input-output devices supporting scalable virtualization
US11055147B2 (en) 2017-05-02 2021-07-06 Intel Corporation High-performance input-output devices supporting scalable virtualization
US10268595B1 (en) 2017-10-24 2019-04-23 Red Hat, Inc. Emulating page modification logging for a nested hypervisor
US11184369B2 (en) * 2017-11-13 2021-11-23 Vectra Networks, Inc. Malicious relay and jump-system detection using behavioral indicators of actors
US11616808B2 (en) 2017-12-01 2023-03-28 At&T Intellectual Property I, L.P. Counter intelligence bot
US10785258B2 (en) 2017-12-01 2020-09-22 At&T Intellectual Property I, L.P. Counter intelligence bot
US11010474B2 (en) 2018-06-29 2021-05-18 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US10956573B2 (en) * 2018-06-29 2021-03-23 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US20200004963A1 (en) * 2018-06-29 2020-01-02 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US11620383B2 (en) 2018-06-29 2023-04-04 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US11604878B2 (en) 2018-06-29 2023-03-14 Palo Alto Networks, Inc. Dynamic analysis techniques for applications
US10853500B2 (en) 2018-08-06 2020-12-01 Xerox Corporation Method and system for identifying virtualized applications
US11528288B2 (en) 2018-11-30 2022-12-13 Ovh Service infrastructure and methods of predicting and detecting potential anomalies at the service infrastructure
US11663032B2 (en) 2019-01-28 2023-05-30 Orca Security LTD. Techniques for securing virtual machines by application use analysis
US11740926B2 (en) 2019-01-28 2023-08-29 Orca Security LTD. Techniques for securing virtual machines by analyzing data for cyber threats
US11516231B2 (en) 2019-01-28 2022-11-29 Orca Security LTD. Techniques for securing virtual machines
US11663031B2 (en) 2019-01-28 2023-05-30 Orca Security LTD. Techniques for securing virtual cloud assets at rest against cyber threats
US11431735B2 (en) 2019-01-28 2022-08-30 Orca Security LTD. Techniques for securing virtual machines
US11868798B2 (en) 2019-01-28 2024-01-09 Orca Security LTD. Techniques for securing virtual machines
US11775326B2 (en) 2019-01-28 2023-10-03 Orca Security LTD. Techniques for securing a plurality of virtual machines in a cloud computing environment
US11693685B2 (en) 2019-01-28 2023-07-04 Orca Security LTD. Virtual machine vulnerabilities and sensitive data analysis and detection
US11726809B2 (en) 2019-01-28 2023-08-15 Orca Security LTD. Techniques for securing virtual machines by application existence analysis
US11706251B2 (en) 2019-09-13 2023-07-18 Palo Alto Networks, Inc. Simulating user interactions for malware analysis
US11196765B2 (en) 2019-09-13 2021-12-07 Palo Alto Networks, Inc. Simulating user interactions for malware analysis
US11689560B2 (en) 2019-11-25 2023-06-27 Cisco Technology, Inc. Network-wide malware mapping
US11669263B1 (en) 2021-03-24 2023-06-06 Meta Platforms, Inc. Systems and methods for low-overhead and flexible trace capture triggered on-demand by hardware events under production load
RU2791824C1 (en) * 2022-02-14 2023-03-13 Group-IB Global Private Limited Method and computing device for detecting a target malicious web resource

Also Published As

Publication number Publication date
EP2774039A4 (en) 2015-11-11
EP3093762B1 (en) 2020-01-08
EP3093762A1 (en) 2016-11-16
EP2774038A4 (en) 2015-08-19
US20130117848A1 (en) 2013-05-09
WO2013067508A1 (en) 2013-05-10
EP2774038B1 (en) 2016-06-22
EP2774038A1 (en) 2014-09-10
CA2854183A1 (en) 2013-05-10
EP2774039A1 (en) 2014-09-10
WO2013067505A1 (en) 2013-05-10
EP2774039B1 (en) 2019-09-11
CA2854182A1 (en) 2013-05-10

Similar Documents

Publication Publication Date Title
US9519781B2 (en) Systems and methods for virtualization and emulation assisted malware detection
US9792430B2 (en) Systems and methods for virtualized malware detection
US10095866B2 (en) System and method for threat risk scoring of security threats
US10216931B2 (en) Detecting an attempt to exploit a memory allocation vulnerability
US20130232576A1 (en) Systems and methods for cyber-threat detection
US9548990B2 (en) Detecting a heap spray attack
US10505975B2 (en) Automatic repair of corrupt files for a detonation engine
CN108369541B (en) System and method for threat risk scoring of security threats
Hatem et al. Malware detection in cloud computing
Megira et al. Malware analysis and detection using reverse engineering technique
US20160269443A1 (en) Exploit detection based on heap spray detection
US11636208B2 (en) Generating models for performing inline malware detection
US20210200859A1 (en) Malware detection by a sandbox service by utilizing contextual information
US20210021611A1 (en) Inline malware detection
US10601867B2 (en) Attack content analysis program, attack content analysis method, and attack content analysis apparatus
WO2021015941A1 (en) Inline malware detection
Kührer et al. Cloudsylla: Detecting suspicious system calls in the cloud
US20230244787A1 (en) System and method for detecting exploit including shellcode
US20220245249A1 (en) Specific file detection baked into machine learning pipelines
Afonso et al. A hybrid system for analysis and detection of web-based client-side malicious code
Jayarathna et al. Hypervisor-based Security Architecture to Protect Web Applications.
Sridhar Testbed Design For Evaluation Of Active Cyber Defense Systems
Badr et al. Malware Detection in Cloud Environment
Shukla et al. VAPT & Exploits, along with Classification of Exploits

Legal Events

Date Code Title Description
AS Assignment

Owner name: CYPHORT INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GOLSHAN, ALI;BINDER, JAMES S.;REEL/FRAME:028068/0436

Effective date: 20120418

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.)

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4