Monday, February 25, 2008

Impact analysis

I recently co-taught a class on using Helix for incident response and forensics. During the course I found myself thinking back to past conversations and papers I've read on the subject of impact and impact analysis...

It's well known that responders have an impact on a system, and it's perhaps even better known that we can be the single greatest force exerted on a system while it's up and running.

For the purposes of this entry I am referring to WFT and the monolithic configuration file on Helix, which unnecessarily runs multiple tools that use the same APIs and the same function calls. First, a quick discussion of this: if two tools use the same API and the same function calls to collect data, it's pretty safe to say that we will collect the same information from either tool, barring some form of subversion or direct attack against a specific tool.

Since WFT in Helix (1.9) uses both, I'll refer to two tools here: dumpel and psloglist.

Dumpel comes to us from the Windows 2000 Resource Kit; psloglist comes from Sysinternals. Both tools can extract event logs from a system. Psloglist is definitely more feature-rich, but both accomplish the same thing when run locally against a system.

To do a comparison, I used PE Explorer and extracted the dependencies and PE import tables of each.

It's a little burdensome to post the complete diff here, so here's a screenshot. Let's just say that the only differences in the dependency lists are what I'm showing.

[screenshot: diff of the dumpel and psloglist dependency lists]

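For anyone who wants to reproduce this comparison without PE Explorer, here's a rough scriptable equivalent using the pefile Python library. This is just a sketch under my assumptions: the file paths are placeholders for wherever you've copied the two binaries, and the output format is mine, not PE Explorer's.

# Dump and diff the import tables of two PE files with the pefile
# library (pip install pefile). File paths are placeholders.
import pefile

def imports(path):
    """Return a set of (dll, function) pairs from a PE's import table."""
    pe = pefile.PE(path)
    found = set()
    for entry in getattr(pe, "DIRECTORY_ENTRY_IMPORT", []):
        dll = entry.dll.decode().lower()
        for imp in entry.imports:
            # imports by ordinal have no name
            name = imp.name.decode() if imp.name else "ord(%d)" % imp.ordinal
            found.add((dll, name))
    return found

a = imports("dumpel.exe")       # placeholder paths to local copies
b = imports("psloglist.exe")

print("only in dumpel:   ", sorted(a - b))
print("only in psloglist:", sorted(b - a))
print("shared imports:   ", len(a & b))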
Ok, so the dependencies are the same. How about the function calls from each? The main functions called to open, read and close event logs in both tools come out of advapi32.dll, kernel32.dll and user32.dll. They both use the exact same libraries (psloglist imports more because it's more feature-rich), which is no real surprise.
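
To make that shared call sequence concrete, here's a minimal sketch of the advapi32 entry points involved: OpenEventLog, GetNumberOfEventLogRecords, ReadEventLog and CloseEventLog. It uses Python's ctypes purely for illustration (Windows only); a real collector would loop on ReadEventLog until it hits the end of the log.

import ctypes
from ctypes import wintypes

advapi32 = ctypes.windll.advapi32
advapi32.OpenEventLogW.restype = wintypes.HANDLE

EVENTLOG_SEQUENTIAL_READ = 0x0001
EVENTLOG_FORWARDS_READ   = 0x0004

# NULL server name = the local machine, i.e. the same "run locally
# against a system" case as both tools
log = advapi32.OpenEventLogW(None, "System")
if not log:
    raise ctypes.WinError()

total = wintypes.DWORD(0)
advapi32.GetNumberOfEventLogRecords(log, ctypes.byref(total))
print("records in System log:", total.value)

# one sequential read shown here; a real collector loops until the
# call fails with ERROR_HANDLE_EOF
buf = ctypes.create_string_buffer(0x10000)
read = wintypes.DWORD(0)
needed = wintypes.DWORD(0)
advapi32.ReadEventLogW(log,
                       EVENTLOG_SEQUENTIAL_READ | EVENTLOG_FORWARDS_READ,
                       0, buf, len(buf),
                       ctypes.byref(read), ctypes.byref(needed))
print("bytes returned by first read:", read.value)

advapi32.CloseEventLog(log)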

After limited testing so far, it seems the tools provide the same results. So my question is: why run both tools? Why execute the same function calls from two tools?


If a large part of response is about minimizing impact (since doing no harm is not possible), then there is no good reason I can think of to run two tools that use the same API and function calls. We use more memory and have a greater impact on the system than is required. More specifically, there is a problem with Helix's use of WFT: it seems to treat every problem as a nail, running extraneous tools that increase our impact when there is no good technical need. This all goes back to knowing your tools and understanding what they do and how they work.

What are your thoughts?

7 comments:

Lance Mueller said...

Great points and excellent illustration. I think many of the automated IR collection tools like to throw the kitchen sink at the problem by default and subscribe to "more is better". In addition, it seems to me that a lot of people tend to use the "canned" versions of these tools rather than tweaking them and customizing what is collected, because it's easier and less work.

I agree with the "use more memory" statement, but would also state that order of volatility comes into play and could negate some of that concern.

H. Carvey said...

While I like to have options available when it comes to tools, in most cases, I neither see the need nor recommend that folks use two (or more) tools that rely on the same API function calls to do the same thing. After all, one of the ways to detect things like rootkits is to use some sort of differential analysis, and the only real way to do that is to try to use two (or more) different techniques (function calls) to collect the same data.
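
In sketch form, that differential check amounts to something like the following: run two collectors that reach the same data through different techniques, normalize the output, and flag whatever one view shows that the other hides. Both collector names below are hypothetical stand-ins, not real tools.

import subprocess

def collect(cmd):
    """Run a collector and return its non-blank output lines as a set."""
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    return {line.strip() for line in out.splitlines() if line.strip()}

high_level = collect(["api_view.exe"])   # hypothetical API-based listing
low_level  = collect(["raw_view.exe"])   # hypothetical raw-parse listing

# entries visible in the raw view but missing from the API view are
# exactly the hiding behavior a rootkit check is looking for
for entry in sorted(low_level - high_level):
    print("hidden from API view:", entry)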

I don't think I agree with the statement about using more memory right off the bat. When the first tool is run, memory that isn't already allocated is used...when the process completes, the memory is freed. When the next tool is run, at least some of that recently freed memory would likely be used.

Helix is a good tool in the sense that it provides an interface for folks who wouldn't normally do any sort of volatile data collection on Windows systems to actually do something...this is the first step in the genesis towards doing actual IR. Yes, there are extraneous tools, but hey, it's a start, and if it gets Joe Sysadmin to collect volatile data when under normal circumstances he wouldn't...well, that's something.

hogfly said...

Great comments guys.

What I think people tend to find misleading is that they see Helix and they see WFT, and the assumption is made that the tool is pre-configured to follow best practice since "some forensics guys put the CD together".

Harlan, I think I actually deleted something I intended to put in the post which affirms what you're saying about rootkit detection. It was something to the effect of: "if you're going to run multiple tools to perform the same duty, use tools that don't use the same API, or use an API-agnostic tool like Carvey's event log parsing scripts".
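
By "API-agnostic" I mean parsing the saved .evt file directly rather than asking the event log service for records. A very rough sketch of the idea, assuming a copied-off SysEvent.Evt (the path is hypothetical), skipping wrapped/dirty logs and the variable-length strings; a real parser would validate the length field instead of trusting every signature hit:

import struct
import time

LFLE = b"LfLe"   # signature that starts every EVENTLOGRECORD

with open("SysEvent.Evt", "rb") as f:    # hypothetical saved log
    data = f.read()

# skip the 48-byte file header, which also carries the signature
pos = data.find(LFLE, 48)
while pos != -1:
    start = pos - 4                      # the Length DWORD precedes 'LfLe'
    (_length, _sig, rec_num, t_gen, _t_written,
     event_id, ev_type, _nstrings, _cat) = struct.unpack_from(
        "<IIIIIIHHH", data, start)
    # the low word of EventID is what psloglist and Event Viewer display
    print(rec_num, time.ctime(t_gen), "EventID", event_id & 0xFFFF,
          "type", ev_type)
    pos = data.find(LFLE, pos + 4)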

As for using more memory, we certainly do use more memory. I'm not saying that we're necessarily destroying something of value, but that we're simply creating a larger footprint than is required to do the job. And if not memory, then the file system certainly suffers if we run some combination of tools when we don't need to.

Anonymous said...

> To do a comparison, I used PE Explorer

Could you please elaborate a bit more on what you used?

hogfly said...

Anonymous: I used PE Explorer from heaventools.com to look at import/export tables, and traced down the DLLs used and their dependencies.

H. Carvey said...

Hogfly,

...or use an API-agnostic tool...

How do you do that on a Windows system? Even most (not all) rootkit detection tools query data using both high- and low-level APIs, and then 'diff' the output.

I have seen folks in the past claim that certain tools didn't use any API function calls, but just the fact that those tools have populated import tables...well, you get my point...

...then the file system certainly suffers...

Well, that sort of depends on the tools, doesn't it? There are tools...like 'dir'...that will get data for you without modifying the file system, or even file metadata.

Anonymous said...

One thing to take into account is that tools like WFT lower the bar of entry. Normal admins, ones who have a hard time adding users, can put in the CD and collect important data using a nice GUI. Additionally, I have seen some tools fail when running WFT, and when you have inexperienced admins running this tool I would rather have too much info than none at all. Far too often, by the time the CIRT gets onsite they have trampled the data, so we have them run WFT ASAP.

In the past I have used WFT a lot, but as I have gained experience I have come to a similar conclusion about the tool.

I have begun testing IRCR, and made my own script that uses the same tools Helix does. It's much cleaner from a system-impact perspective.