Friday, November 12, 2021

Satori Updates

I've continued to update Satori little by little out on github, both the underlying code and the fingerprints.  I'm always happy to hear new ideas or feedback on the program.

Saturday, February 23, 2019

Youtube channel

I've continued to make updates to the python version of Satori and have put a lot of time in the past few weeks into updating fingerprints and fixing some minor bugs that have cropped up.  But.... I thought I'd take this opportunity to do a quick intro to the youtube channel I started.

It is called, you guessed it, Chatter on the Wire, or CotW for short.  It won't just be about OS fingerprinting, but I'm sure I'll put some demos of Satori and other programs I've written out there.  There will be videos about a number of different things: small electronics, coding projects, melting metals, and whatever else I find myself periodically doing that I decide to document.  Mostly I believe it will be reviews of products I've been using myself or testing out for specific applications at home, at work, or in other areas of my life.

It will bounce around a bit and I hope to put stuff out every so often.  Not sure if that will be every few days, weeks, or months; it will all depend on time and the amount of effort it takes to put the videos together.

Sunday, November 11, 2018

posted to github

https://github.com/xnih/satori

3 modules so far:

  • dhcp
  • tcp
  • useragent

It runs anywhere from 7.5 to 20.4 times faster than my old Windows version.


I have not run all my old pcaps through it at this point, but I did run a few from my last SANS course through it and found a few bugs I had.

Fingerprint files for each of those 3 have been updated, with a number more hopefully coming within the next week.

I also hope to get smb done before long, as I see there is an smb module in pypacker (missed it before).


Sunday, November 4, 2018

Satori rewrite update

Real life/work always gets in the way.....

Lost 2 weeks to work-related fun, but I finally had a chance to revisit the rewrite this weekend and parse some of the tcp pcaps I've gathered in the past month or so to feed it.

tcp.xml has been updated with a number of newer OSes.  Since it had been 4+ years since I'd last updated it, there were a lot of Windows versions to add.  I did find it interesting that I can tell the difference between Windows 10 build 14393 (and potentially earlier) and build 15063 and greater!  Something changed in the tcp stack between those builds.  I also added some 2012 and 2016 server fingerprints and a few others, such as some newer OS X ones.  It is by no means complete with what I have, but when you're that far behind on things, it takes a while to start adding things again!

As for the rewrite itself, I got some command line options in place so you can choose which modules you want to use (tcp is done, dhcp is next to be worked on).  Not sure which ones I'll port, but it is moving along.  I also got code in place to choose the file you want to read in, instead of having it hard coded.  Neither is a huge accomplishment, but they were things I had to fix to make it even partially usable.
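In case it helps picture it, the option handling amounts to something like the sketch below (the flag names here are illustrative, not necessarily the exact ones in satori.py):

# Minimal sketch of the module/file selection described above.
# The flag names (--read, --modules) are illustrative only.
import argparse

def parse_args():
    parser = argparse.ArgumentParser(description="Satori passive OS fingerprinting")
    parser.add_argument("--read", dest="pcap_file",
                        help="pcap file to read instead of a hard coded name")
    parser.add_argument("--modules", default="tcp",
                        help="comma separated list of modules to run, e.g. tcp,dhcp")
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    enabled = {m.strip().lower() for m in args.modules.split(",")}
    print("reading:", args.pcap_file)
    print("modules:", sorted(enabled))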

I also got things broken out into different files; initially it was all one big jumble, but hey, I'm not really a programmer, I hack stuff together to make it work!

Hope to have something out on github for testing in the near future, but it all depends on free time.

Sunday, October 14, 2018

TCP module done???

First the good news: the tcp module, including reading in the tcp.xml from years ago, seems to be done.

I ran nmap -O 192.168.x.0/24 against my local home network and saved a tcpdump file to disk while it was running.  Of course I forgot about it, and it kept listening while any traffic it could see was going out to the internet as well.  Currently satori.py is hard coded to read one specific .pcap file, which I then read in:

python3 satori.py | awk -F';' '$7 != "" { print $3, $6, $7 }' | sort -u


Output:
192.168.1.108 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
192.168.1.109 29200:64:1:60:M1460,S,T,N,W7:. Ubuntu 18.x:5
192.168.1.131 8760:64:0:52:M1460,N,W0,N,N,S:A Hewlett-Packard JetDirect:5
192.168.25.128 29200:64:1:60:M1460,S,T,N,W7:. Ubuntu 18.x:5
198.8.71.207 65535:128:1:52:M1460,N,W1,N,N,S:A Windows 7:5
207.198.x.38 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
207.198.x.39 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
207.198.x.40 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
207.198.x.41 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
207.198.x.42 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
209.15.x.11 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5
209.15.x.8 8192:128:1:52:M1460,N,W8,N,N,S:A Windows 7:5|Windows Server 2008:5


The format above is just src IP, tcp fingerprint, OS guess.  In true Satori fashion, the OS guess has a value associated with it, so that I can tie the different ones together and give a guess based on different protocol fingerprints.  The format is OS Guess:Weight | next OS Guess:its weight, etc.

Full output from Satori looks like:

2018-10-14T01:50:57.063663;00:0C:29:5F:9F:42;192.168.25.128;TCP;S;29200:64:1:60:M1460,S,T,N,W7:.;Ubuntu 18.x:5

Date/Time in UTC, ISO format; SRC MAC; SRC IP; which protocol it came from; in the case of TCP, whether it was a S or SA; TCP fingerprint; OS guess w/ weight

In the case of TCP, the SRC MAC is normally worthless as it will be the router's, but for DHCP and others it is the unique identifier I need.
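If you'd rather post-process that output in python instead of awk, splitting on the semicolons is all it takes; a quick sketch using the field order described above:

# Split a Satori output line into named fields, per the format above:
# timestamp;src MAC;src IP;protocol;S or SA;fingerprint;OS guesses
record = "2018-10-14T01:50:57.063663;00:0C:29:5F:9F:42;192.168.25.128;TCP;S;29200:64:1:60:M1460,S,T,N,W7:.;Ubuntu 18.x:5"

ts, src_mac, src_ip, proto, flags, fingerprint, guesses = record.split(";")

# Each guess carries a weight ("OS:weight"), and multiple guesses are "|" separated
for guess in guesses.split("|"):
    os_name, weight = guess.rsplit(":", 1)
    print(src_ip, os_name, weight)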

Now the minor bad news....

pypacker doesn't have any built-in ability to read live packets; it only has its ppcap piece for reading captured files.  I'm going to dink around with pcapy later and verify whether I can use it to read live packets and just feed them in, similar to the pypacker buffer.  I have high hopes, but it will be a few days until I get some time to verify.
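The idea I want to verify is roughly the following (just a sketch of how I expect it to work, assuming pcapy's open_live/loop interface and handing the raw bytes straight to pypacker; "eth0" is only an example interface):

# Sketch: read live packets with pcapy and hand the raw bytes to pypacker,
# the same way the ppcap reader hands buffers to the dissection code.
import pcapy
from pypacker.layer12 import ethernet

def handle_packet(header, data):
    eth = ethernet.Ethernet(data)   # pypacker dissects the raw frame
    print(eth)                      # real code would hand eth to the tcp/dhcp modules

reader = pcapy.open_live("eth0", 65536, 1, 100)  # iface, snaplen, promisc, timeout ms
reader.loop(-1, handle_packet)                   # -1 = keep looping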

So what is next...

  1. Add a feature to read specific capture files from disk instead of the single hard coded one that I keep manually renaming to test.pcap!
  2. Investigate pcapy for live capture input into the script instead of just saved files
  3. Update tcp.xml as it is WAY out of date!
  4. Get set up on github or something similar (I think I already have an account) and publish this as I go
  5. Add a ton of error checking; currently there is only one try/except clause in it, and it was only added during the last round of testing when things were blowing up on my nmap pcap file.

Sunday, October 7, 2018

Satori Rewrite?

Ok, it has been 3 years since the last time I posted on rewriting Satori.....

I sat down Friday after work with pypacker.  I put an hour or two into getting it to read packets with some example code and parse through tcp packets to at least get my TCP fingerprinting under way.  I bounced a couple of questions off the developer (one earlier in the week when I was hoping to start, and another once I finally did).  He was helpful in both cases, and by the end of Saturday I had code in place that properly dissects TCP packets.  I have 2 pieces to fix, one of which has always been my nemesis: bit shifting.  I never really got my head around how it works 15 years ago when I wrote Satori in Delphi, and now that I rarely program this type of stuff anymore, I'm no better off with python.  The difference in 2018, though, is that I'm not writing the protocol dissectors anymore!

Pypacker isn't really designed, per se, for what I'm using it for; it is really more for making your own packets, but it has the ability to decode them as well!  The developer already has the protocol stack built out for almost everything I need, just missing SMB.  Once I get through TCP, DHCP and a few others I'll start looking at that one, but it will be a bit down the road.
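For what it's worth, the decode side is pretty straightforward; something along these lines is all the TCP module really needs to get at the fingerprint fields (a sketch against the pypacker API as I understand it; the attribute names may differ slightly between versions, and test.pcap is just a placeholder):

# Sketch: walk a pcap with pypacker and pull the TCP fields that feed the
# fingerprint (window size, TTL, SYN vs SYN/ACK).  Attribute names are the
# ones I believe pypacker uses; double check against your version.
from pypacker import ppcap
from pypacker.layer12 import ethernet
from pypacker.layer3 import ip
from pypacker.layer4 import tcp

preader = ppcap.Reader(filename="test.pcap")    # placeholder file name

for ts, buf in preader:
    eth = ethernet.Ethernet(buf)
    ip4 = eth[ip.IP]
    tcp1 = eth[tcp.TCP]
    if ip4 is None or tcp1 is None:
        continue
    syn = bool(tcp1.flags & tcp.TH_SYN)
    ack = bool(tcp1.flags & tcp.TH_ACK)
    if syn:                                     # only SYN and SYN/ACK are interesting
        print(ip4.src_s, "SA" if ack else "S", tcp1.win, ip4.ttl)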

The one difference between this rewrite and the ones I claimed in 2015, 2014, 2013..... is that I'm actually really interested in doing it this time.  The code will also all be open sourced this time around and the project will be hosted out on something like github.

Time permitting today, I should have TCP, p0f v2 style, and ettercap done.  I hope to have something in place as well to actually parse through the fingerprint files and spit out a guess at the OS.  While I'd prefer to do DHCP as my first one, as that was where I really enjoyed this the most, TCP seems like the most useful.  Once I get this done I'll look at p0f v3, which came out in the 2014 time frame as I was really winding down my work in this field.
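The matching side really just amounts to looking the computed signature up in the xml and handing back name/weight pairs; roughly like the sketch below (the element and attribute names here are purely illustrative, not the actual tcp.xml schema):

# Sketch of matching a computed signature against a fingerprint file.
# The XML layout assumed here (<fingerprint os=...><test matches=... weight=...>)
# is purely illustrative; the real tcp.xml schema differs.
import xml.etree.ElementTree as ET

def load_fingerprints(path):
    """Map signature string -> list of (os_name, weight)."""
    db = {}
    for fp in ET.parse(path).getroot().iter("fingerprint"):
        for test in fp.iter("test"):
            sig = test.get("matches")
            db.setdefault(sig, []).append((fp.get("os"), int(test.get("weight", 5))))
    return db

def guess_os(db, signature):
    """Return guesses formatted Satori-style: OS:weight|OS:weight."""
    return "|".join(f"{name}:{weight}" for name, weight in db.get(signature, []))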

Anyway, if you are doing any type of python and network stuff, I highly recommend you check out pypacker.  I had tried doing this before with scapy, dpkt and a few others, but they were all a bit slow or convoluted for me and didn't have enough of the protocols already built out.  Or maybe they really did and I just wasn't motivated enough, can't really say.

It's fun to be working on this project again after this long break.  Once I get it moving along, the fingerprint files will be updated again as well.

Initial output:
192.168.25.128:36526 -> 216.58.217.34:443
 Flags: S ,Fingerprint: 29200:64:4096:60:M1460,S,T,N,W7:.
216.58.217.34:443 -> 192.168.25.128:36526
 Flags: SA ,Fingerprint: 64240:128:0:44:M1460:A

The 4096 part is due to bad bit shifting on my part to read the don't fragment bit (reading 1 bit out of 16 is so much fun).  I did a kludge elsewhere in the code, but now that I remember how bit shifting works, I may have to go back and rewrite that.  But 95% of the way there on TCP at this point!
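For reference (mostly for future me), pulling the don't fragment bit out of the 16-bit flags/fragment-offset field is just a shift and a mask; a quick sketch in plain python:

# The IP "flags + fragment offset" field is 16 bits: the top 3 bits are
# reserved/DF/MF and the low 13 bits are the fragment offset.
def parse_frag_field(field16):
    df = (field16 >> 14) & 0x1      # don't fragment bit
    mf = (field16 >> 13) & 0x1      # more fragments bit
    offset = field16 & 0x1FFF       # fragment offset, in 8-byte units
    return df, mf, offset

# 0x4000 is the typical value for an unfragmented packet with DF set
print(parse_frag_field(0x4000))     # -> (1, 0, 0)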

Wednesday, May 10, 2017

Fingerbank Collector

Ok, it has been eons since my last post, and this has more to do with other projects taking up my time in electronics than in fingerprinting, but I still like to dabble in this world as time permits.

Going back to 2007, it looks like fingerbank has taken on a lot of what Satori does/did, which is very cool to see!

https://fingerbank.org/collector.html?utm_source=pf-announce-en&utm_medium=email&utm_campaign=gartner_collector

ARP, DHCP v4 and v6, DNS and mDNS, HTTP and HTTPS, Radius (one I never tried to utilize) and TCP

They are doing it cloud based and it is closed source, but it is still cool to see that what I was doing 7 years ago is going mainstream these days in yet another project.

Saturday, November 14, 2015

Satori rewrite

Ok, for years I've been planning on rewriting Satori in python (or something else) and never have gotten around to it.  Well, 2 weeks ago I started playing with pyshark while working with SMB packets for the FOR572 class.  More on that project in the future, but it got me thinking: why go to all of the headache of writing new code to parse all those packets when I can instead use the power of tshark, via pyshark?

So with that said, I really do plan on Satori 2.0 (or would it be 1.0, since I never made it out of the 0.7x arena?).  The future releases of Satori will be pyshark/python based, with tshark on the backend to do the heavy lifting.  I plan on just coding enough to pull the needed info and query the underlying .xml files for fingerprint data.  This will get it off the ground again, though it may not make it as fast as it could be, but trade offs: it is that or I probably never get back to it :)
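To give an idea of why this is so much less code, something like the sketch below is about all pyshark needs to hand me the TCP fields I care about (the exact tshark field names, and the example.pcap file name, are assumptions to double check):

# Sketch: let tshark (via pyshark) do the dissection and just read out the
# fields a TCP fingerprint needs.  Field names mirror tshark's display-filter
# fields; verify them against your tshark version.
import pyshark

cap = pyshark.FileCapture(
    "example.pcap",                                               # placeholder capture
    display_filter="tcp.flags.syn == 1 and tcp.flags.ack == 0",   # SYNs only
)

for pkt in cap:
    print(pkt.ip.src, pkt.ip.ttl, pkt.tcp.window_size_value)

cap.close()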

I'm not sure I can do everything I was doing with Satori before, but I can easily do dhcp, the http user agent string, and some of the smb stuff I was doing.

I'm thinking about adding some SSL fingerprinting to it also.

All of this is to say that, evidently, Satori isn't dead from my end!  It has just taken a bit of a break.

Tuesday, January 13, 2015

OS's - patching and support

In the past few weeks Microsoft has appeared a bit peeved with Google's disclosure policy.  They had a patch planned for Patch Tuesday (today), but the information about it hit the 90 day mark back at the end of December and was released by Google.

There have been a number of threads going on the different lists I'm on, some supporting this, saying Microsoft knew the policy and knew the deadline, while others are upset at Google, who knew a patch was in the plans and released the data anyway.

I've been back and forth on Full Disclosure vs Responsible Disclosure over the years.  I see both sides and understand the needs.  I do believe the security researchers that find these bugs and push the vendors to get patches out the door are important, but I also believe a lot of these researchers (not all, but a lot) haven't had to support large organizations and deal with the "headache" these things cause.

In the end, supporting or trying to secure a large organization is tough to start with, made tougher by the numerous pieces of software and hardware that may be out there, and made even tougher when you don't have total control over what is on your network (at least in the .edu space).  Add to that screwed up patches that get pulled and 3rd parties disclosing things "days" before a patch is due out, and it's almost enough to make you pull your hair out some days.

Microsoft has the problem of trying to make sure that things are backwards compatible, supporting things from 10+ years ago.  Google, on the other hand, just drives forward with new OS versions, dropping support for older ones.

Case in point:
https://community.rapid7.com/community/metasploit/blog/2015/01/11/google-no-longer-provides-patches-for-webview-jelly-bean-and-prior

As we see more and more devices built on the Android OS and their support just ending, due to carriers not doing upgrades or whatever, it will be interesting to see how things play out in the future.  Will Google actually be the one that takes the heat from the public, or will the carriers?  Or will the idea of just throwing the old equipment away and constantly upgrading continue to be the norm?

I'm still running an old 2.3.x "smart" phone.  I don't surf the net with it; it gives me phone access and my calendar stuff.  It works for what I need, but I know its limitations and the security implications if I surf the web with it.  How many users do?  Should we really be forced to spend that much every 2 years to replace older tech?  Maybe things just change constantly; it isn't like when I started, when we used to get new AV definitions only every 6 months :)  But I hate to see us continue to throw away perfectly working tech that could be patched.

Oh well, I digress.  Another Patch Tuesday is upon us and another will come next month.  Changes will continue to happen, and those that have to support systems will continue to adapt or, short of that, move on to other things!

Friday, December 26, 2014

Pi clones

The raspberry pi was a great little device a few years ago.  Great price point, and it allowed you to run linux on a very small device that, if lost or stolen, wouldn't be the end of the world, but it also had very little horsepower to do much of anything.  Trying to run a GUI on it was painfully slow due to the limited ram and single core proc running at 700 MHz or so.

There have been a few different clones; I found 2 I liked last night while searching around for things.

The ODROID-C1 - http://ameridroid.com/products/odroid-c1
At $37 it is hard to beat a quad core, 1 GB ram device.  It should provide a bit more oomph when needed!  It has been on my wish list to order since I found it; perhaps today I'll do it.

The BPi-R1 - http://www.bananapi.com/index.php/component/content/article?layout=edit&id=59
Newegg is carrying these at $75 (http://www.newegg.com/Product/Product.aspx?Item=9SIA6DB29F2479).  While not as powerful as the ODROID-C1, this one is the kitchen sink: designed specifically for a home router type setup, it has what I've wanted, which is a built in switch.  I've wanted to build a box to route game traffic through and do some manipulation of the traffic.  Not sure if it will be powerful enough, but doing some tcpdumps off this may be an option.

For now I think I'll stick with the ODROID-C1 and just put a USB to ethernet adapter on it so I have a 2nd wired port to route stuff through.

Anyway, the raspberry pi has made some great things possible.  I hope we continue to see higher powered systems (with more ram, really) at the sub-$50 mark.

Maybe I'll just build a few of these with ARPWatch on them and drop them on some of my closed networks!