Forensic Lab With Links to Montana DOJ Leaks Phone Extracts
Multiple extracts for cases involving a cop suicide, homicides, CSAM and more were left publicly exposed.
Disclaimer: This post will outline my perspective on the events that unfolded from my discovery of the leak to its resolution. You can read more details about the incident from others involved at the links at the end of the post.
On June 5th, while going through alerts I had recently received for exposed endpoints, one in particular caught my attention because its file listing mentioned “warrants”.
I set a script running to list every file exposed on the IP address, and while I waited, I began looking through what had already been listed. As I scrolled, I grew more and more concerned about what I had just stumbled upon.
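As a rough illustration (not my exact script), the sketch below walks a share anonymously and prints only file paths, never file contents. I'm assuming an SMB share reachable with a null/guest session here, and the host and share name are placeholders:

```python
# Rough illustration only: recursively list file paths on an anonymously
# accessible SMB share without downloading any file contents.
from impacket.smbconnection import SMBConnection

HOST = "192.0.2.10"   # placeholder address, not the real server
SHARE = "Evidence"    # example share name

def walk(conn, share, path="\\"):
    """Print every entry under `path`, descending into subdirectories."""
    for entry in conn.listPath(share, path + "*"):
        name = entry.get_longname()
        if name in (".", ".."):
            continue
        print(path + name)
        if entry.is_directory():
            walk(conn, share, path + name + "\\")

conn = SMBConnection(HOST, HOST)
conn.login("", "")    # null session / guest access
walk(conn, SHARE)
conn.close()
```

In practice you write the listing to a file and leave it running; shares this large take a long time to enumerate.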
Shocking initial findings
One of the exposed network shares, named “Evidence”, listed multiple TBs of data, and because the cases were so severe, some of the directory names were easy to match to media reports; with a few searches I linked two of them to a cop suicide and a homicide case. It got even worse when I noticed another, clearly related IP address that was also exposed and contained multiple mentions of “CSAM” in directory and file names.
Now, I’ve stumbled upon a few honeypots over the years, but this was not a honeypot at all: the information matched multiple news reports, and the software listed consisted of tools that police use routinely.
This meant I would have to try to get it shut down as fast as possible, but it also meant I had to be extra careful handling it, as I didn’t want to accidentally download potentially illegal files.
Almost skipped over fixing this
The day after I started looking into this, on June 6th, I noticed the file listing had stopped, not because it had finished, but because the server was no longer online. My initial thought was that they had some kind of system that flagged my activity and that they had fixed the exposure. On the morning of June 10th, since a few days had passed, I tried to connect one more time to confirm the exposure was fixed, only to find the server back online and still exposed.
Perhaps the infrastructure hosting the servers was turned off for the weekend? I can’t confirm what happened exactly, though.
Back to investigating
I noticed that each phone extract came with a PDF report generated by the extraction tool, so I pulled down a few of the recent ones to see if they pointed to the entity responsible for the server.
One of the most recent extracts on the server had been created the day after I first looked at this, and it was a phone extract of a Cascade County employee's device for a “Policy Violation”.
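For anyone trying something similar, one low-touch way to see which tool produced a report (and sometimes which organisation) is to read only the PDF's document metadata rather than its contents. This is an illustrative sketch with a hypothetical filename, not necessarily what identified anything in this case:

```python
# Illustrative sketch: inspect a PDF report's document-information metadata,
# which often names the generating tool, without reviewing the report body.
from pypdf import PdfReader

reader = PdfReader("extraction_report.pdf")   # hypothetical local filename
meta = reader.metadata
if meta:
    print("Title:   ", meta.title)
    print("Author:  ", meta.author)
    print("Creator: ", meta.creator)    # application that created the document
    print("Producer:", meta.producer)   # library/tool that produced the PDF
    print("Created: ", meta.creation_date)
```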


Usually, at this point in a discovery, when I find something serious but can’t identify who is responsible, I escalate to the relevant authorities or agencies in that country.
The issue is, every time I tried that in the past, whether through the CISA or IC3 reporting pages, the only outcome was being ignored; the only replies I ever received were my tickets being closed weeks after I filed them, with no resolution at all.
Waiting weeks for a slim chance of a reply would put these servers at even greater risk, as this service is frequently hit by automated ransomware scripts, so I asked for help from people who have helped me with notifications in the past.
On June 12th, I asked Martin Seeger (@masek), who was already helping me with another issue, if he wanted to help with yet another exposure.
Martin made a post on his social media to gather contacts who could help, and privately we discussed and investigated what we could do with the information available to identify someone involved whom we could contact.
On June 17th, Dissent from https://databreaches.net also joined the efforts and made some contacts.
On the same day, thanks to both their latest contacts, the exposure was fixed.
Summary of events
April 29th - I start a run scanning for exposed shares on this service; one of the IPs from this story is included in that run
May 14th - One of the servers triggers an alert on my feeds
June 3rd - A second server triggers an alert on my feeds, from a newer run of IPs
June 5th - While doing research, I spot the June 3rd alert and start investigating
June 6th - The servers go offline
June 10th - I double-check to confirm the issue was fixed, only to find the servers back online and still exposed
June 12th - With no one responsible identified, I get in touch with Martin for help with notifications.
June 17th - Dissent joins the efforts and contacts some more people.
June 17th - Dissent's and Martin's efforts, almost at the same time, manage to get someone in contact with the lab responsible, and the exposure is fixed
You can read Martin’s perspective here: https://infosec.exchange/@masek/114721620930871030
You can also read Dissent’s perspective here: https://databreaches.net/2025/06/22/a-state-forensics-lab-was-leaking-its-files-getting-it-locked-down-involved-a-number-of-people
Final Comments
The links to the Montana DOJ were made through the GrayKey serial number and through some of the contacts made by Martin and Dissent, as explained further in their posts.
Because of how severe the exposure looked from the file listing alone, I tried to access as few files as possible, which also meant the people involved had to spend a lot of time figuring out whom to reach out to with only small pieces of information. A special thanks to Martin and Dissent for volunteering their time to help fix what is probably the most serious data exposure I've personally found to date.
I did not access any actual phone extract; I only looked at a few TXT and PDF files. The file listing, however, showed actual phone extracts and over 5 TB of exposed data. I never got a full listing of what exactly was exposed, so I can’t say precisely how much data it contained; any figures are based only on what I ended up listing.
If you’re interested in more incidents I’ve dealt with, you can find all my public finds indexed by country in the post below: