Archive for the Research Category

[Release] ModularUrl Java Class

Posted in Research, Tools on July 30, 2010 by Skyler

So in the development of my Web Application fuzzer, I came upon the challenge of creating test cases from enumerated URLs. After fussing (play on words intended) around with some chunky logic, I had a palm-to-face moment. Perhaps it was that I had recently spent too much time in “procedural-language land”, but then the obvious object-oriented approach hit me.

The result was a quick creation of two Java classes that allow me to easily manipulate URLs and their parameters; essentially, these classes do all the parsing for you. They are very simple, but quite effective. Perhaps it was the contrast between the frustration and such a simple fix, but I felt this code simply could not wait to be released with my fuzzer.

UPDATE: The code here has been picked up by Softpedia! You can get it here!

Here it is on SourceForge.

And in typical Security Reliks fashion, a nasty copy&paste version:

import java.util.ArrayList;

public class ModularUrl {
    String base;
    ArrayList<ModularUrlParameter> params;

    public ModularUrl(String url){
        String[] baseSplit = url.split("[?]");
        base = baseSplit[0];
        // split the parameters up
        String[] paramSplit = baseSplit[1].split("&");
        params = new ArrayList<ModularUrlParameter>();
        for(int i = 0; i < paramSplit.length; i++){
            params.add(new ModularUrlParameter(paramSplit[i]));
        }
    }

    // Re-joins the parsed parameters into a single query string.
    // (The excerpt was truncated here; this body is a reasonable
    // reconstruction, not the released code verbatim.)
    public String getAllParametersAsString(){
        StringBuilder result = new StringBuilder();
        for(int i = 0; i < params.size(); i++){
            result.append(params.get(i).toString());
            if(i < params.size() - 1){
                result.append("&");
            }
        }
        return result.toString();
    }
}

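The class above depends on a companion ModularUrlParameter class that is not shown in this excerpt. A minimal sketch of what it might look like (the field and method names here are my guesses, not necessarily the released API):

```java
// Hypothetical sketch of the companion parameter class the excerpt references;
// the released code's actual field/method names may differ.
public class ModularUrlParameter {
    String name;
    String value;

    // Parses a single "name=value" pair as split out of the query string.
    public ModularUrlParameter(String pair) {
        String[] parts = pair.split("=", 2);
        name = parts[0];
        value = parts.length > 1 ? parts[1] : "";
    }

    // Re-assembles the pair for rebuilding the full URL.
    public String toString() {
        return name + "=" + value;
    }
}
```

With something like this in place, new ModularUrl("http://host/page?a=1&b=2") gives a base of http://host/page and two parameter objects you can manipulate independently.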

Hacking Web 2.0 with Firefox

Posted in Fu (a.k.a Tips), Research on July 30, 2010 by Skyler

Here is a great Symantec document relating to Web 2.0 and its security vulnerabilities.

It outlines the security challenges related to testing, and then shows you some tools in Firefox to help do it.

Check it out here!

CovertChannelRelik v0.1 Released!

Posted in Research, Tools on July 12, 2010 by Skyler

A while ago I created a covert channel for a demonstration in a class of mine. I am now releasing the code for research purposes or those interested in seeing how one works.

I was simply trying to show how information can be passed in a completely “undetectable” manner, even if someone is actively sniffing your traffic. I ran my covert server and client, as well as Wireshark, and showed how the messages I was sending actually contained secret data invisible to the human eye.

The way it works:

  • I enter a secret message to be sent, and it is converted into binary.
  • I then begin sending “carrier messages”.
  • The server splits each carrier message into words. Each word’s ASCII values are summed, and the word is either left alone or appended with an invisible character, so that the sum is even or odd depending on the next bit of the secret message.
  • The client adds up the value of each word and stores it as a 1 or a 0, slowly assembling the secret message as the carrier messages are received.

Go ahead and try it out. Just run the client, the server, and Wireshark. Observe the communication and see how covert channels work!

The whole project isn’t exactly in “practical” form, but it could easily be made into one. This is really just a “secret” chat client. If there is demand for a more practical version, I would be happy to make one. Until then, enjoy!

Download source here!

For those who just want to see the source for what makes the magic happen, here it is:

// encode msg into the client-side equivalent of the covert message
// (encodeTotal, covertSendingCounter, covertMsgBinary, and DEFAULT_BUFFER
// are class fields defined elsewhere in the source)
public CovertMsgBean encodeString(String msg){
    StringBuffer encodedString;
    String[] msgList = msg.split(" ");
    CovertMsgBean covertBean;
    // iterate through the message one word at a time, resetting the encoding total for every word
    for(int m = 0; m < msgList.length; m++){
        encodeTotal = 0;
        // iterate through that word one character at a time, summing the character values
        for(int i = 0; i < msgList[m].length(); i++){
            encodeTotal += msgList[m].charAt(i);
        }
        // if we have sent the whole covert message, blank out the current and remaining words and stop
        if(covertSendingCounter == covertMsgBinary.length()){
            for(int s = 0; s < (msgList.length - m); s++){
                msgList[(msgList.length - 1) - s] = " ";
            }
            break;
        }

        // if the word's parity already matches the next bit of the covert message's binary form, it's good...
        if((encodeTotal % 2) == Integer.parseInt(String.valueOf(covertMsgBinary.charAt(covertSendingCounter)))){
            // leave the word untouched
        } else {
            // ...if not, append the invisible buffer character to flip the word's parity
            msgList[m] = msgList[m] + DEFAULT_BUFFER;
        }
        covertSendingCounter++;
    }
    // wrap the array of words back into one single string for sending
    String listMsg = new String();
    for(int j = 0; j < msgList.length; j++){
        listMsg += msgList[j];
        if(j != msgList.length - 1){ // the original compared against msgList.length, which always appended a trailing space
            listMsg += " ";
        }
    }
    encodedString = new StringBuffer(listMsg);
    covertBean = new CovertMsgBean(encodedString);
    return covertBean;
}
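On the receiving side, the client’s decoding is just the mirror image: sum each word’s character values and read off the parity. A minimal standalone sketch of that idea (the class and method names are mine, and it assumes the invisible buffer character has an odd value so appending it flips a word’s parity):

```java
public class CovertDecoderSketch {
    // Recovers the covert bit string from a received carrier message:
    // an even character sum yields a 0 bit, an odd sum yields a 1 bit,
    // matching the parity rule used by the encoder.
    public static String decode(String carrierMsg) {
        StringBuilder bits = new StringBuilder();
        for (String word : carrierMsg.split(" ")) {
            int total = 0;
            for (int i = 0; i < word.length(); i++) {
                total += word.charAt(i);
            }
            bits.append(total % 2);
        }
        return bits.toString();
    }
}
```

For example, decode("ab abc") returns "10", since 'a'+'b' = 195 (odd) and 'a'+'b'+'c' = 294 (even).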

“Netcat Is Your Friend” Case Study

Posted in Research, Reviews on July 10, 2010 by Skyler

For anyone interested in Incident Handling, or Penetration Testing case studies, I would recommend you read “Netcat Is Your Friend” (by T. Brian Granier). This is a case study included as part of the SANS GCIH course.

If you are anything like me, you love to read about clever attack vectors and the depth of the attacks behind them. I find it MUCH more interesting than any ESPN highlight reel or 6 o’clock news high-speed chase.

It is a well-written, well-structured paper. It gives complete examples of all the code used, and explains it well. The second half of the paper then goes through the Incident Handling procedure. One thing that makes this paper a good reference is the “Jump Bag” list. I have made a blog post on Jump Bags in which I outline the list contained within “Netcat Is Your Friend”. You can find it here, or under the ‘tools’ section!

I recommend everyone at least take a look at the paper! I believe you will find it quite informative, and if you are anything like me, quite entertaining.

GRC vs. Wicked

Posted in Research on July 5, 2010 by Skyler

I just finished reading the story of Steve Gibson and his DDoS encounter.

I don’t want to give too much away, but I would definitely suggest everybody read it. The technology is a little older, but the scope and vision of his research is truly remarkable.

The paper traces the story and research that came from a DDoS attack against his website. He goes on to dissect and reveal the entire bot infrastructure, and culture of those hacking circles.

It is a definite read! Download it here. [MD5 (grcdos.pdf) = cbb4a85fb81c005b327dffaef28fbee3]

I also suggest checking out his podcast, SecurityNow.

Web Fuzzing

Posted in Research, Tools on July 5, 2010 by Skyler

So I am currently working on creating a Web Application/Service fuzzer. This seemed like an easy enough task, except for the fact that it’s never really been done yet… to the extent I’m aiming for.

There have been some tools out there for fuzzing. There are fully functional tools for web apps, like SPIKE from Immunity, but it isn’t automated. There are some sweet intelligent fuzzing frameworks like Peach and Sulley, but these have the same limitation, in addition to not being aimed at the web. I found another sort of framework called RFuzz. It is more suited to the web, yet even less automated than the other tools.

My specifications make the project a bit more formidable. I need to create an intelligent fuzzer that can be centrally hosted on an intranet and accessed via simple web request forms. I will explain my design and its difficulties below:

My solution contains 3 different modules: A Fuzzing Engine, Web UI, and Enumerator/Crawler.

Fuzzing Engine: The engine I decided on was RFuzz. Because it naturally targets web applications, I felt it would be most accurate for the job. Creating this modularly involves writing a wrapper that parses enumerated attack points and loops them through the fuzzing process. The output wrapper then formats the results into XML for easy integration into the Web UI, where users can review the results. I am basing most of this portion off work done by Rune Hammerland in his master’s thesis. This portion of the project shouldn’t be too bad, except for the fact that I have to learn Ruby to do it. Good thing Ruby is pretty straightforward 😉
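As a rough illustration of the control flow I have in mind for that wrapper (sketched in Java for readability, though the real wrapper will be written in Ruby around RFuzz; every name here is a placeholder):

```java
import java.util.List;

public class FuzzWrapperSketch {
    // Loops each enumerated attack point through a list of fuzz payloads and
    // records every case as an XML fragment for the Web UI to display.
    public static String fuzzAll(List<String> attackPoints, List<String> payloads) {
        StringBuilder xml = new StringBuilder("<results>\n");
        for (String point : attackPoints) {
            for (String payload : payloads) {
                // In the real engine this would fire an RFuzz request and
                // capture the response; here we just record the pairing.
                xml.append("  <case point=\"").append(point)
                   .append("\" payload=\"").append(payload).append("\"/>\n");
            }
        }
        xml.append("</results>");
        return xml.toString();
    }
}
```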

Enumerator/Crawler: There are a lot of web crawling libraries/tools out there, hpricot being a big one. The one weakness of hpricot is that it can’t crawl AJAX. You see, in AJAX you can essentially bind different dynamic states to nearly any HTML element you want. In normal static pages you can simply focus on the anchor (<a>..</a>) tags and button elements; in AJAX, however, the possibilities go far beyond that. Fortunately, I found a sweet project called Crawljax, written in Java, which makes me happy. Crawljax can completely exercise any element on any webpage you decide, making it ideal for crawling AJAX pages. The hard part is outputting these results in a workable format. I have been having issues trying to figure out Crawljax’s output. It seemed straightforward at first:

config.setOutputFolder("/tmp/");

but I think perhaps it’s bugged.

Instead I might try wrapping the whole request in WebScarab and intercepting the requests. The benefit of this is the functionality of the WebScarab libraries: modifying parameter fields in WebScarab will be much easier than trying to parse them out of a Crawljax dump.

Crawling Web Services is simple: direct the enumerator at the WSDL/WADL and parse out the target fields. Easy enough.
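For instance, the operation names can be pulled out of a WSDL with nothing but the JDK’s built-in DOM parser. A minimal sketch of that step (the class and method names are mine):

```java
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class WsdlEnumeratorSketch {
    // Pulls the operation names out of a WSDL document; each one is a
    // candidate attack point for the fuzzer.
    public static List<String> listOperations(String wsdlXml) throws Exception {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true); // WSDL elements live in a namespace
        Document doc = factory.newDocumentBuilder()
                .parse(new ByteArrayInputStream(wsdlXml.getBytes("UTF-8")));
        List<String> ops = new ArrayList<String>();
        // a WSDL declares its callable operations as <operation name="..."> elements
        NodeList nodes = doc.getElementsByTagNameNS("*", "operation");
        for (int i = 0; i < nodes.getLength(); i++) {
            String name = ((Element) nodes.item(i)).getAttribute("name");
            if (name.length() > 0 && !ops.contains(name)) {
                ops.add(name);
            }
        }
        return ops;
    }
}
```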

After the whole page is crawled, the output then needs to be formatted to be sent over to the fuzzer.

Web UI: Easy enough. Just need a little Java interface to submit the requests.

Overall the project will be fun. If you have any ideas or input, please let me know. I’m currently looking at a Java Enumerator/Crawler, a Ruby fuzzer, and an HTML/PHP/Java Web UI.

I’ll keep you updated!