Black Hat USA 2012: Scientific but Not Academical Overview of Malware Anti-Debugging, Anti-Disassembly and Anti-VM Technologies - PowerPoint Presentation





SLIDE 1

Scientific but Not Academical Overview of Malware Anti-Debugging, Anti-Disassembly and Anti-VM Technologies

Rodrigo Rubira Branco (@BSDaemon)
Gabriel Negreira Barbosa (@gabrielnb)
Pedro Drimel Neto (@pdrimel)
SLIDE 2

Where we live…

BLACK HAT USA 2012

SLIDE 3

Nah, we actually live here…

SLIDE 4

São Paulo

SLIDE 5

Agenda

• Introduction / Motivation
• Objectives
• Methodology
• Dissect || PE Project
• Executive Summary
• Techniques
• Resources
• Conclusions

SLIDE 6

Introduction / Motivation

• Hundreds of thousands of new samples every week
• Still, automation is about single tasks or single analyses
• Presentations still report tests over tens of thousands of samples (what about the millions of samples?)
• Companies promote research that uses words such as 'many' instead of an actual number

SLIDE 7

Before continuing, some definitions…

• Anti-Debugging: techniques to compromise debuggers and/or the debugging process
• Anti-Disassembly: techniques to compromise disassemblers and/or the disassembly process
• Obfuscation: techniques to make signature creation more difficult and the disassembled code harder for a professional to analyze
• Anti-VM: techniques to detect and/or compromise virtual machines

SLIDE 8

Objectives

• Analyze millions of malware samples
• Share the current results related to:
  • Anti-Debugging
  • Anti-Disassembly
  • Obfuscation
  • Anti-VM
• Keep sharing more and better results in our portal (www.dissect.pe):
  • New malware samples are always being analyzed
  • Detection algorithms are constantly being improved
  • The system does not analyze only anti-RE things

SLIDE 9

Dissect || PE Project

• Scalable and flexible automated malware analysis system
• Receives malware from trusted partners
• Portal with analysis data available for partners, researchers and general media

SLIDE 10

Dissect || PE – Overview

• Free research malware analysis system for the community:
  • Open architecture
  • Works with plugins
• 10 dedicated machines distributed across 3 sites:
  • 2 sites in Brazil (São Paulo and Bauru)
  • 1 site in Germany
• Some numbers:
  • Receives more than 150 GB of malware per day
  • More than 30 million unique samples

SLIDE 11

Dissect || PE – Partners

SLIDE 12

Dissect || PE – Backend

• Each backend downloads samples scheduled for analysis (our scheduling algorithms are documented in an IEEE Malware 2011 paper)
• Analyzes samples:
  • Both static and dynamic analysis currently supported
• Analysis results accessible from the portal:
  • Synced back from the backend
• Some characteristics:
  • Plugins
  • Network traffic
  • Unpacked version of the malware

SLIDE 13

Dissect || PE – Plugins

• Samples are analyzed by independent applications named "plugins"
• Easy to add and/or remove plugins:
  • Just a matter of copying or removing their files
• Language independent
• Easy to write new plugins:
  • Needed information comes as arguments
    • We usually create handlers so the researcher does not need to change his actual code
  • Simply print the result to stdout
    • The backend takes care of parsing it accordingly
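The backend side of this contract can be sketched in a few lines: run the plugin as a subprocess and treat whatever it printed as the result. This is only an illustration of the stdout convention described above; the function name, result dictionary, and exit-code handling are assumptions, not the actual Dissect || PE backend code.

```python
import subprocess
import sys

def run_plugin(plugin_argv, timeout=60):
    """Run one analysis plugin and capture its stdout as the result.

    Plugins are independent programs in any language; the backend only
    parses what they print. A crashing or missing plugin must not
    compromise the system (the failsafe property), so failures are
    recorded rather than raised.
    """
    try:
        proc = subprocess.run(plugin_argv, capture_output=True,
                              text=True, timeout=timeout)
    except (OSError, subprocess.TimeoutExpired):
        return {"ok": False, "result": None}
    # The slide's C example returns 1, so treat 0 and 1 as normal exits.
    if proc.returncode not in (0, 1):
        return {"ok": False, "result": None}
    return {"ok": True, "result": proc.stdout.strip()}
```

For example, running the Python plugin from the next slide through this wrapper yields the string "My plugin result." back to the backend.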
SLIDE 14

Dissect || PE – Plugin Examples

• Python

  print("My plugin result.")

• C

  #include <stdio.h>

  int main(int argc, char **argv)
  {
      printf("My plugin result.\n");
      return 1;
  }

SLIDE 15

Dissect || PE – Plugin Types

• Static:
  • Usually executed outside of the VM (we already have an exception for the unpacking plugin)
  • Failsafe: errors do not compromise the system
  • Might get executed in one of two different situations, depending on where we copied the plugin:
    • Before the malware is executed
    • After the malware was executed
• Dynamic:
  • Executed inside a Windows system (for now the only supported OS; others coming soon)

SLIDE 16

Dissect || PE – Network Traffic

• During dynamic analysis all network traffic is captured
• Pcap available at the portal
• Dissectors:
  • Analyze the pcap and print the contents in a user-friendly way
  • Support IRC, P2P, HTTP, DNS and other protocols
  • SSL inspection (pre-loaded keys)
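A dissector like the ones above starts by walking the pcap container itself, which is simple enough to do with only the standard library: a 24-byte global header, then for each packet a 16-byte record header (seconds, microseconds, captured length, original length) followed by the raw frame. A minimal sketch, assuming the classic little-endian pcap format (magic 0xA1B2C3D4); big-endian captures and pcapng are not handled:

```python
import struct

PCAP_MAGIC_LE = 0xA1B2C3D4  # classic pcap, microsecond timestamps

def iter_pcap_packets(data):
    """Yield (ts_sec, ts_usec, frame_bytes) for each record in a pcap buffer."""
    magic, = struct.unpack_from("<I", data, 0)
    if magic != PCAP_MAGIC_LE:
        raise ValueError("not a little-endian classic pcap file")
    offset = 24  # skip the global header
    while offset + 16 <= len(data):
        ts_sec, ts_usec, incl_len, _orig_len = struct.unpack_from(
            "<IIII", data, offset)
        offset += 16
        yield ts_sec, ts_usec, data[offset:offset + incl_len]
        offset += incl_len
```

Protocol-specific dissectors (IRC, HTTP, DNS, …) would then parse each yielded frame.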

SLIDE 17

Methodology

• Used a total of 72 cores and 100 GB of memory
• Analyzed only 32-bit PE samples
• Packed samples:
  • Different samples using the same packer were counted as 1 unique sample
    • So each sample was analyzed once
  • Analyzed all packers present among the 4 million samples
• Unpacked samples:
  • Avoided samples bigger than 3.9 MB for performance reasons (with some exceptions, such as the Flame malware)

SLIDE 18

Methodology

• Static analysis:
  • Main focus of this presentation
  • Improves the throughput (with well-written code)
  • Not detectable by malware
• Dynamic counterpart:
  • It is not viable to statically detect everything
  • Already developed and deployed, but not covered by this presentation
    • The related results can be found at https://www.dissect.pe
SLIDE 19

Methodology

• The malware protection techniques in this work come from:
  • State-of-the-art papers/journals
  • Malware in the wild
• Some techniques we documented are not yet covered by our system:
  • The system is constantly being updated
• All techniques were implemented even when there were no public examples of them (github)
• Our testbed comprises 883 samples, used to:
  • Detect bugs
  • Measure performance
  • Verify technique coverage

SLIDE 20

Methodology

• Possible technique detection results:
  • Detected: the current detection algorithms detected the malware protection technique
  • Not detected: the current detection algorithms did not detect the malware protection technique
  • Evidence detected: the current detection algorithms could not deterministically detect the protection technique, but some evidence was found

SLIDE 21

Methodology

• Analysis relies on executable sections and on the entrypoint:
  • Decreases the probability of analyzing data as code
  • Further improves the analysis time
  • For now we miss non-executable areas, even if they are referenced by analyzed sections (future work will cover this)
• Disassembly-related analysis framework:
  • Facilitates the development of disassembly analysis code
  • Speeds up the disassembly process for plugins
  • Calls back the plugins for specific instruction types
  • Disassemble once, analyze all
  • Care must be taken to avoid disassembly attacks
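The "disassemble once, analyze all" idea above is essentially a callback dispatcher: the framework decodes each instruction a single time and hands it to every plugin that registered interest in that instruction type. A toy sketch of that architecture, with pre-decoded instruction tuples standing in for a real disassembler (the class and method names here are invented for illustration, not the framework's real API):

```python
from collections import defaultdict

class DisasmFramework:
    """Disassemble once, analyze all: plugins subscribe per mnemonic."""

    def __init__(self):
        self._callbacks = defaultdict(list)

    def register(self, mnemonic, callback):
        """Subscribe a plugin callback to one instruction mnemonic."""
        self._callbacks[mnemonic].append(callback)

    def analyze(self, instructions):
        # `instructions` is an iterable of (address, mnemonic, operands);
        # a real implementation would decode bytes here, once for all plugins.
        for addr, mnemonic, operands in instructions:
            for cb in self._callbacks[mnemonic]:
                cb(addr, mnemonic, operands)

# Two hypothetical plugins: one anti-VM check, one timing check.
hits = []
fw = DisasmFramework()
fw.register("sidt", lambda a, m, o: hits.append(("anti-vm", a)))
fw.register("rdtsc", lambda a, m, o: hits.append(("timing", a)))
fw.analyze([(0x1000, "push", "ebp"),
            (0x1001, "sidt", "[esp]"),
            (0x1004, "rdtsc", "")])
```

The design point is the single decoding pass: adding a plugin adds a callback, not another disassembly of the sample.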

SLIDE 22

Executive Summary

SLIDE 23

Packed vs Not Packed

SLIDE 24

Top Packers

SLIDE 25

Malware Targeting Brazilian Banks

SLIDE 26

Protecting Mechanisms of Packers

Paper (yes, we wrote one...)

SLIDE 27

Protected Samples

SLIDE 28

Anti-RE Categories

SLIDE 29

Anti-Disassembly

SLIDE 30

Anti-Debugging

SLIDE 31

Obfuscation

SLIDE 32

Anti-VM

SLIDE 33

Anti-Debugging Techniques

• Studied and documented 33 techniques
• Currently scanning samples for 30 techniques:
  • Detected: marked in green
  • Evidence: marked in yellow
  • Not covered: marked in black

SLIDE 34

Anti-Debugging Techniques

• PEB NtGlobalFlag (Section 3.1)
• IsDebuggerPresent (Section 3.2)
• CheckRemoteDebuggerPresent (Section 3.3)
• Heap Flags (Section 3.4)
• NtQueryInformationProcess – ProcessDebugPort (Section 3.5)
• Debug Objects – ProcessDebugObjectHandle Class (Section 3.6)
• Debug Objects – ProcessDebugFlags Class [1] (Section 3.7)
• NtQuerySystemInformation – SystemKernelDebuggerInformation (Section 3.8)
• OpenProcess – SeDebugPrivilege (Section 3.9)
• Alternative Desktop (Section 3.10)

SLIDE 35

Anti-Debugging Techniques

• Self-Debugging (Section 3.11)
• RtlQueryProcessDebugInformation (Section 3.12)
• Hardware Breakpoints (Section 3.13)
• OutputDebugString (Section 3.14)
• BlockInput (Section 3.15)
• Parent Process (Section 3.16)
• Device Names (Section 3.17)
• OllyDbg – OutputDebugString (Section 3.18)
• FindWindow (Section 3.19)
• SuspendThread (Section 3.20)

SLIDE 36

Anti-Debugging Techniques

• SoftICE – Interrupt 1 (Section 3.21)
• SS Register (Section 3.22)
• UnhandledExceptionFilter (Section 3.23)
• Guard Pages (Section 3.24)
• Execution Timing (Section 3.25)
• Software Breakpoint Detection (Section 3.26)
• Thread Hiding (Section 3.27)
• NtSetDebugFilterState (Section 3.28)
• Instruction Counting (Section 3.29)
• Header Entrypoint (Section 3.30)
• Self-Execution (Section 3.31)
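Many of the API-based techniques in this list (IsDebuggerPresent, CheckRemoteDebuggerPresent, OutputDebugString, …) leave their import names visible in the binary, so a static scanner can at least flag them as evidence. A deliberately simplified sketch of such an evidence check (the real system inspects the PE import table rather than grepping raw bytes, and a string hit is only "evidence detected", since the name can also appear in legitimate code or be resolved dynamically):

```python
# Names of Windows APIs commonly used by the anti-debugging techniques
# listed in Sections 3.1-3.31.
ANTI_DEBUG_API_NAMES = (
    b"IsDebuggerPresent",
    b"CheckRemoteDebuggerPresent",
    b"NtQueryInformationProcess",
    b"OutputDebugString",
)

def scan_debug_api_evidence(sample_bytes):
    """Return the anti-debugging API names referenced in a sample.

    A hit is only evidence: the name may come from a legitimate import,
    a string constant, or dynamic resolving via GetProcAddress.
    """
    return [name.decode() for name in ANTI_DEBUG_API_NAMES
            if name in sample_bytes]
```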

SLIDE 37

Anti-Disassembly Techniques

• Studied and documented 9 techniques and variations
• Currently scanning samples for 8 techniques and variations:
  • Detected: marked in green
  • Evidence: marked in yellow
  • Not covered: marked in black

SLIDE 38

Anti-Disassembly Techniques

• Garbage Bytes (Section 4.2.1)
• Program Control Flow Change (Section 4.2.2)
  • Direct approach
  • Indirect approach
• Fake Conditional Jumps (Section 4.2.3)
  • XOR variation
  • STC variation
  • CLC variation
• Call Trick (Section 4.2.4)
• Flow Redirection to the Middle of an Instruction (Section 4.2.5)
  • Redirection into other instructions
  • Redirection into itself
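To make the XOR variation of a fake conditional jump concrete: `xor eax, eax` always sets the zero flag, so a following `jz` is unconditional at runtime, while a disassembler still explores both branch targets (one of which can hold garbage bytes). A byte-level sketch of detecting the pair, using the x86 encodings 31 C0 / 33 C0 for `xor eax, eax` and 74 for `jz rel8`; this is a simplification — it only covers the EAX form, and a raw byte scan can match data, so real detection must disassemble:

```python
# xor eax,eax (0x31 0xC0 or 0x33 0xC0) always sets ZF,
# so a following jz rel8 (0x74) is always taken.
FAKE_JZ_PATTERNS = (b"\x31\xc0\x74", b"\x33\xc0\x74")

def find_fake_conditional_jumps(code):
    """Return offsets of xor eax,eax ; jz rel8 sequences in a code buffer."""
    offsets = []
    for pattern in FAKE_JZ_PATTERNS:
        start = code.find(pattern)
        while start != -1:
            offsets.append(start)
            start = code.find(pattern, start + 1)
    return sorted(offsets)
```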

SLIDE 39

Obfuscation Techniques

• Studied and documented 14 techniques and variations
• Currently scanning samples for 7 techniques and variations:
  • Detected: marked in green
  • Evidence: marked in yellow
  • Not covered: marked in black

SLIDE 40

Obfuscation Techniques

• Push Pop Math (Section 4.3.1)
• NOP Sequence (Section 4.3.2)
• Instruction Substitution (Section 4.3.3)
  • JMP variation
  • MOV variation
  • XOR variation
  • JMP variation (Push Ret)
• Code Transposition (Section 4.3.4)
  • Program control flow forcing variation
  • Independent instructions reordering variation

SLIDE 41

Obfuscation Techniques

• Register Reassignment (Section 4.3.5)
• Code Integration (Section 4.3.6)
• Fake Code Insertion (Section 4.3.7)
• PEB->Ldr Address Resolving (Section 4.3.8)
• Stealth Import of the Windows API (Section 4.3.9)
• Function Call Obfuscation (Section 4.3.10)

SLIDE 42

Anti-VM Techniques

• Studied and documented 7 techniques and variations
• Currently scanning samples for 6 techniques and variations:
  • Detected: marked in green
  • Evidence: marked in yellow
  • Not covered: marked in black

SLIDE 43

Anti-VM Techniques

• CPU Instruction Results Comparison (Section 5.1)
  • SIDT approach
  • SLDT approach
  • SGDT approach
  • STR approach
  • SMSW approach
• VMware – IN Instruction (Section 5.2)
• VirtualPC – Invalid Instruction (Section 5.3)

SLIDE 44

New Techniques

• We understand that malware is quickly evolving, so analysis has to move at least as fast
• SSEXY:
  • SSE obfuscation tool released at Hack in The Box Amsterdam (17-20 May) by Jurriaan Bremer
  • In June we already had a plugin to detect it
• Flame:
  • The industry positioned it as completely new, embedding a LUA interpreter for rapid development of new capabilities
  • We implemented a plugin for the detection of embedded LUA as soon as the news came out
    • And we can TELL you that there is no other malware containing LUA
    • We do not have to assume it, as we have analysis results
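A plugin like the embedded-LUA detector above can start from the strings a statically linked Lua runtime typically carries, such as its version banner. The exact markers below are assumptions for illustration; the heuristics of the real Dissect || PE plugin are not published in this deck:

```python
# Strings that builds of the Lua runtime commonly embed. These markers
# are an assumption for illustration, not the real plugin's heuristics.
LUA_MARKERS = (b"Lua 5.", b"$LuaVersion", b"lua_newstate")

def embedded_lua_evidence(sample_bytes):
    """Return the Lua runtime markers found inside a sample."""
    return [m.decode() for m in LUA_MARKERS if m in sample_bytes]
```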
SLIDE 45

New Techniques

• On July 25, Morgan Marquis-Boire and Bill Marczak released a paper about the FinFisher spy kit. Their paper mentions many protection techniques used by the code:
  • A piece of code for crashing OllyDbg
  • DbgBreakPoint Overwrite (covered in Section 3.33)
  • IsDebuggerPresent (covered in Section 3.2)
  • Thread Hiding (covered in Section 3.27)
  • Debug Objects – ProcessDebugObjectHandle Class (covered in Section 3.6)

SLIDE 46

Resources

• Sample code for the different techniques we detect is available on github:
  • https://github.com/rrbranco/blackhat2012
    • We will open the repository just after the conference
• Updated versions of the paper and presentation will be available at:
  • http://research.dissect.pe

SLIDE 47

Resources – Portal Demo

• Portal URL: http://www.dissect.pe
• Any interested researcher / contributor / journalist can have access to the portal (drop us an email or come to the Qualys booth)
• We are constantly updating the statistics and developing/improving the analysis algorithms

SLIDE 48

Conclusions

• We analyzed millions of malware samples and showed scientific results about their usage of protection techniques
• There are more techniques to implement and some algorithms to improve:
  • We still have a lot to do... and so do you! Help us!
• The portal (www.dissect.pe) is always updated with new and better results:
  • More detection techniques
  • More analyzed samples

SLIDE 49

Acknowledgments

• Ronaldo Pinheiro de Lima – joined our team a bit later in the research process, but made amazing contributions!
• Peter Ferrie – amazing papers, great feedback/discussions by email
• Jurriaan Bremer – SSEXY
• Reversing Labs – TitaniumCore

SLIDE 50

THE END! Really!?

Rodrigo Rubira Branco (@BSDaemon)
Gabriel Negreira Barbosa (@gabrielnb)
Pedro Drimel Neto (@pdrimel)
{rbranco,gbarbosa,pdrimel} *noSPAM* qualys.com