Friday, March 22, 2013



Extortion works

It all started at the PyCon conference in Santa Clara this past week, when two men made some crude jokes about “forking software repositories” and “big dongles.” A dongle is a small piece of hardware, loaded with software, that serves as a key to unlock a program so that one may use it; its purpose is to prevent unauthorized copying and distribution. In this instance, the reference was anatomical rather than cryptographic. The inappropriate jokes violated the conference code of conduct.

Adria Richards, who sat in front of the men, reported them to the conference administrators, who escorted them out of the session. She also snapped their picture and posted comments about the incident to Twitter and her blog: http://butyoureagirl.com/14015/forking-and-dongle-jokes-dont-belong-at-tech-conferences/

Adria Richards wasn’t just any old blogger or conference attendee, though. She held a fairly influential position in Developer Relations at SendGrid, a company that delivers bulk commercial email for other businesses. Her Developer Relations role matters to the events that transpired.

One of the men was let go by his employer, PlayHaven, in response to the post. After his dismissal, he allegedly posted that he was a father of three and was disappointed because he had really liked the job. Developers responded by posting demands on Pastebin that Richards also be fired.

On Thursday, March 21, SendGrid came under a distributed denial of service attack and a number of developers cancelled their accounts. Adria Richards’s personal site also came under attack, and she received death and rape threats. By Thursday afternoon, SendGrid announced on Twitter and Facebook that it had fired Richards. The SendGrid Facebook post stated, “Effective immediately, SendGrid has terminated the employment of Adria Richards. While we generally are sensitive and confidential with respect to employee matters, the situation has taken on a public nature. We have taken action that we believe is in the overall best interests of SendGrid, its employees, and our customers. As we continue to process the vast amount of information, we will post something more comprehensive.”

What’s the take-away from this? Well, like I said, I think we can certainly take away the fact that extortion works. SendGrid’s site was down for most of the day on Thursday, a day when many newsletters and press releases are sent. The company capitulated to the demands of its attackers.

However, Ms. Richards didn’t fire the man who lost his job; PlayHaven did. Yet nobody targeted PlayHaven.

We can expect to see litigation come out of this: criminal, civil, employment and commercial. Websites and services were hacked. Distributed denial of service attacks took down services, costing thousands, if not hundreds of thousands, of dollars. People were threatened with physical harm. People lost their jobs, perhaps wrongly. Companies’ mailings didn’t go out on time because SendGrid was down. And for what? Because a couple of guys made some inappropriate sexual comments at an industry conference. Perhaps we can all learn something in hindsight about the value of treading lightly. While this is an extreme example, it is the reality of our world today. A single post caused all this damage. Before any of us acts or renders advice in response to a situation, we need to pause and consider the possible ramifications of our actions.


Monday, November 19, 2012

Preview for Broad Range of Criminal Activity Finds Child Pornography: 6th Cir. Reverses Trial Court on Suppression



In US v. Schlingloff, the Sixth Circuit reversed the trial court's denial of a motion to suppress. The digital forensic examiner was executing a search warrant to look for evidence of passport fraud. He used a feature of the digital forensics software FTK (a tool of the same sort as EnCase) to flag a broad range of files indicative of criminal activity, including child exploitation images. The search yielded files containing suspected child pornography, and the defendant sought to suppress that evidence, arguing that the search exceeded the scope of the warrant. The trial court denied the motion to suppress, but the appellate court reversed, citing the examiner's purposeful choice to search for child exploitation material even though it was not enumerated in the warrant. The feature the examiner used required him to select search criteria that included files likely to contain child pornography in addition to files likely to contain evidence of passport fraud.
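Flagging of this kind generally works by hashing every file on the evidence image and comparing the hashes against reference sets of files already known to investigators (in FTK, the Known File Filter). Here is a minimal Python sketch of the idea, not of FTK itself; the hash value, mount point and file names are hypothetical placeholders.

```python
import hashlib
from pathlib import Path

# Hypothetical reference set: hashes of files already known to investigators.
# The value below is a placeholder, not a real hash.
KNOWN_FILE_HASHES = {
    "0000000000000000000000000000000000000000",
}

def sha1_of_file(path: Path) -> str:
    """Hash a file in 1 MB chunks so large evidence files don't exhaust memory."""
    digest = hashlib.sha1()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_known_files(evidence_root: str) -> list[Path]:
    """Walk a working copy of the evidence and report files matching the reference set."""
    hits = []
    for path in Path(evidence_root).rglob("*"):
        if path.is_file() and sha1_of_file(path) in KNOWN_FILE_HASHES:
            hits.append(path)
    return hits

if __name__ == "__main__":
    for hit in flag_known_files("/mnt/evidence"):  # hypothetical mount point
        print("Flagged:", hit)
```

The legal fight in Schlingloff was not about the hashing, which is uncontroversial; it was about the examiner’s deliberate choice to enable reference sets aimed at offenses outside the warrant.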



Thursday, August 2, 2012

A couple of really good recent e-discovery articles


ABA E-Discovery Expert Says Emerging Standards Are Premature

WRAPPING YOUR ARMS AROUND e-DISCOVERY

By John G. Horn and Michael McCartney

Judge Scheindlin Issues Strong Opinion on Custodian Self-Collection

Government agencies are expected to protest vociferously, arguing that it is unduly burdensome.

By Ralph Losey


and download this White Paper:

By Joshua L. Konkle and Charles Skamser 

DCIG is a company that analyzes software, hardware and services companies within the enterprise data storage and electronically stored information (ESI) industries. Available from DCIG with registration at http://www.dcig.com/2012/07/dcig-announces-the-industrys-first-most-compr.html and for free from Guidance Software at http://www.guidancesoftware.com/dcig-2012.htm

Tuesday, July 10, 2012

DNSChanger Surprise!

The DNSChanger deadline hit and the Internet survived. As with most scares fueled by media hype and top government sources, the malware turned out to be much ado about very little.


Did anybody go to the FBI website to check their PC to see if it was infected? If they did, was it? Just curious.  
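For anyone who does want to know what that check actually involved: an infected machine was one whose DNS settings pointed at the rogue servers the ring had been operating, so the test amounted to comparing your configured DNS servers against the published rogue address ranges. Below is a rough Python sketch of that comparison for Unix-like systems that keep resolver settings in /etc/resolv.conf; the ranges shown are placeholder TEST-NET addresses, and you would substitute the ranges actually published by the FBI and the DNS Changer Working Group.

```python
import ipaddress
import re

# Placeholder ranges only (reserved TEST-NET blocks). Substitute the rogue
# DNS server ranges published by the FBI/DCWG at the time of the takedown.
ROGUE_RANGES = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def configured_nameservers(resolv_conf: str = "/etc/resolv.conf") -> list[str]:
    """Pull the nameserver entries from a Unix-style resolver configuration."""
    servers = []
    with open(resolv_conf) as handle:
        for line in handle:
            match = re.match(r"\s*nameserver\s+(\S+)", line)
            if match:
                servers.append(match.group(1))
    return servers

def check_for_rogue_dns() -> None:
    """Compare each configured DNS server against the listed rogue ranges."""
    for server in configured_nameservers():
        addr = ipaddress.ip_address(server)
        if any(addr in network for network in ROGUE_RANGES):
            print(f"{server}: matches a listed rogue range -- this machine may be infected")
        else:
            print(f"{server}: not in the listed ranges")

if __name__ == "__main__":
    check_for_rogue_dns()
```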

Saturday, June 16, 2012

Digital devices and miscarriages of justice

We carry our lives on digital devices. For most of us, the information they contain is perfectly innocent. But digital forensics as it’s practiced today can make innocent information look incriminating. That means we may be putting innocent people in jail and letting criminals off. While other forensic science disciplines have come under harsh scrutiny lately, the problems with digital forensics have not received enough attention.
A 2009 study by the National Academy of Sciences sounded the alarm on faulty forensics. The report said most methods of analysis have not been “rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.” The report challenged the reliability of ballistics (“toolmark and firearms identification”), bite mark comparisons, blood spatter analysis, handwriting analysis and even fingerprint examination. The report said little about digital forensics, however, because it is still an “emerging” discipline.

It’s time for a critical look.

There is solid science behind much of digital forensics. We know, for example, that computer hard drives must be copied without altering the contents of the disk. Best practices in digital forensics also are solid. But digital forensic analysts don’t always follow best practice.

Consider some of the following examples, which we have witnessed in Connecticut and nearby jurisdictions.

A police officer “expert” found images from “unallocated space,” the part of a hard drive the computer isn’t using, which may contain deleted files. The officer asserted in an examination report that images retrieved from unallocated space were downloaded by the defendant and deleted.

But such an assertion is not supported by fact. Data can get into unallocated space on a hard drive in a number of ways. In this case, the only appearance of the data was in unallocated space. There was no basis for the examiner to assert that the images had ever been “files” that were subsequently “deleted.”

Here’s another example: a computer’s operating system automatically backs up files into “restore points,” which can leave many copies of the same image scattered across a hard drive. A police officer “expert” recently recovered restore points on a defendant’s hard drive that contained the same two child-porn pictures over and over. By counting every duplicate, he recommended charging the defendant with possession of more than 600 images, nearly all of them the same.

Another police officer “expert” violated a court order when he searched for privileged attorney-client documents on a defendant’s computer, and then handed them over to the prosecutor.

Examination reports often include conclusions from examiners that items were “intentionally downloaded” by the defendant. But it is impossible to arrive at such a conclusion without being present when the defendant actually downloaded the material, or without a videotape of the event.

Poor training is a big part of the problem. Thousands of police officers have been trained to perform digital forensics under federal grant programs. But these police officer examiners are not required to possess any special training or education beyond a minimum level. The 40 hours or so of training they receive in the forensic software they use is typically the extent of their computer science background prior to their first case assignment.

Despite the minimal training of many digital forensics examiners, their findings are often unquestioningly accepted as fact.

Digital evidence can be compelling and it is often unambiguous. In too many cases, however, digital forensics experts make assertions about a defendant’s actions that are not supported by fact. Such errors create the risk of false conviction of the innocent and a free pass for the guilty.

We need higher standards and more professionalism in digital forensics. And we need to give digital forensics the sort of close scrutiny that all the other forensic science disciplines have been getting in recent years.

Roger Koppl, a research fellow at the Independent Institute, Oakland, Calif., is a professor of economics and finance at Fairleigh Dickinson University and director of the university’s Institute for Forensic Science Administration. Monique M. Ferraro is a lawyer and information security and digital forensics consultant at Technology Forensics, LLC, Waterbury, CT.


Read more: http://dailycaller.com/2012/06/15/digital-devices-and-miscarriages-of-justice/

Wednesday, May 16, 2012

ISC2 Hacked? Batten down your hatches

OK, I cannot be the only person in the world who thinks this is ironic and amusing. The International Information Systems Security Certification Consortium, Inc. (ISC2) has had some of its websites go down recently. One can only presume they were hacked, since both of the websites were set up as portals: one for registering to teach internet safety to kids and another for voting in an awards program. 


The ISC2 issues the most prestigious certifications offered in the field of information security: the Certified Information Systems Security Professional (CISSP, a certification I hold, which has specializations in architecture, engineering and management), the Systems Security Certified Practitioner (SSCP), the Certified Authorization Professional (CAP), and the Certified Secure Software Lifecycle Professional (CSSLP).


If the ISC2 people can get hacked, ANYBODY can get hacked. If they aren't secure, NOBODY is secure. If you have important information, back it up and encrypt it. Just sayin'. 
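Since I’m handing out the advice, here is what “encrypt it” can look like in practice: a minimal Python sketch using the cryptography package’s Fernet recipe for symmetric, authenticated encryption. The file names are hypothetical, and the key obviously has to be kept somewhere safer than next to the backup.

```python
from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_backup(plain_path: str, encrypted_path: str, key_path: str) -> None:
    """Encrypt a backup file with a freshly generated symmetric key."""
    key = Fernet.generate_key()
    with open(key_path, "wb") as handle:
        handle.write(key)  # store this key somewhere safe, ideally offline
    with open(plain_path, "rb") as handle:
        data = handle.read()
    token = Fernet(key).encrypt(data)  # authenticated encryption: tampering is detected
    with open(encrypted_path, "wb") as handle:
        handle.write(token)

def decrypt_backup(encrypted_path: str, key_path: str) -> bytes:
    """Recover the original bytes, provided you still have the key."""
    with open(key_path, "rb") as handle:
        key = handle.read()
    with open(encrypted_path, "rb") as handle:
        token = handle.read()
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    # Hypothetical file names.
    encrypt_backup("backup.tar", "backup.tar.enc", "backup.key")
```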

Monday, May 14, 2012

Child Porn Decision Turns On Downloading Intent

New York ruling highlights gray area of Connecticut law


James Kent, a public administration professor at Marist College in Poughkeepsie, N.Y., was convicted of hundreds of counts of procuring and possessing child pornography via the Internet on his work computer. Last week, New York’s highest court reversed the convictions that were based on images located in temporary Internet or “cache” folders on his computer hard drive. The national headlines shouted that the New York Court of Appeals ruled that looking at child porn is not a crime. But neither the decision, nor the technology that guided the justices toward it, is quite that simple.

What the Court of Appeals ruled is that the prosecution must show that a defendant did more than simply view images on a computer screen. According to the majority decision, “some affirmative act is required (printing, saving, downloading, etc.) to show that defendant in fact exercised dominion and control over the images that were on his screen.” But in this case, the justices ruled that the images and videos were apparently downloaded from web sites through the automatic functions of the operating system of the defendant’s computer, and thus there was no proof that the defendant knowingly committed a crime. This holding is consistent with those in some other states and federal circuits, but has not been addressed in Connecticut as yet. This is an important issue because prosecutions are regularly moving forward in the state based on images located in temporary Internet storage and a number of defendants have been convicted.


Accidental Access
Generally speaking, when you go to a web site, images are downloaded to temporary storage on your computer, whether it's a personal computer, tablet, laptop or certain smartphones. This temporary storage is called "cache." The pictures and video are temporarily stored to make it easier for your computer to display those images from the web site if you go back; it makes the processing time faster. This is an automatic process carried out by your web browser.
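To make the “automatic” part concrete, here is a rough Python sketch of what a browser does behind the scenes: request one page, then fetch and store every image that page references, without the user ever clicking on any of those images. The URL and cache directory are hypothetical, and a real browser’s caching is far more elaborate.

```python
import os
import re
import urllib.parse
import urllib.request

def fetch_page_like_a_browser(url: str, cache_dir: str = "cache") -> None:
    """Fetch one page, then automatically download every image it references."""
    os.makedirs(cache_dir, exist_ok=True)
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

    # Crude <img src="..."> extraction; real browsers parse the full document.
    for src in re.findall(r'<img[^>]+src=["\']([^"\']+)["\']', html, re.IGNORECASE):
        image_url = urllib.parse.urljoin(url, src)
        name = os.path.basename(urllib.parse.urlparse(image_url).path) or "image"
        try:
            with urllib.request.urlopen(image_url) as response:
                with open(os.path.join(cache_dir, name), "wb") as out:
                    out.write(response.read())  # stored locally; the user never chose this file
        except OSError:
            pass  # skip images that fail to download

if __name__ == "__main__":
    fetch_page_like_a_browser("https://example.com/")  # hypothetical page
```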

Yes, that means you or a client can unknowingly access child pornography. Pictures or videos depicting child pornography that you never viewed may be automatically downloaded and stored in temporary Internet storage, or cache. And yes, that means that even if you or a client accidentally access child pornography and try to delete it, the police, if they find out about it, will make an arrest and push to prosecute, and the resulting conviction will carry a mandatory minimum sentence of incarceration. In Connecticut, for fewer than 20 images, the mandatory minimum term is a year; for 20 to 49 images, two years; for 50 or more, three years. Anyone sentenced for a child pornography offense must register as a sex offender upon release from prison.

Compare images located in cache to files intentionally saved by the user. Files saved by a user will be found in folders like "My Documents" or "My Pictures." Forensic software like EnCase and Forensic Toolkit can help prosecutors, defense attorneys and their experts figure out whether files have been accessed, modified or deleted, and when those actions occurred. Files located in temporary Internet storage most often are never accessed after they have been initially downloaded. That can be interpreted to mean that the user either didn't know the files were there or couldn't access them, or both.
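The timestamps those tools report are ordinary filesystem metadata. As a rough illustration of the kind of data involved (this is plain Python, not EnCase or FTK), the sketch below prints the modified, accessed and changed/created times recorded for a single file; the path is hypothetical.

```python
import os
from datetime import datetime, timezone

def file_timestamps(path: str) -> dict[str, str]:
    """Report the filesystem timestamps an examiner would look at for one file."""
    info = os.stat(path)

    def fmt(epoch_seconds: float) -> str:
        return datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).isoformat()

    return {
        "modified": fmt(info.st_mtime),
        "accessed": fmt(info.st_atime),  # often unreliable; many systems defer these updates
        "changed_or_created": fmt(info.st_ctime),  # metadata change on Unix, creation on Windows
    }

if __name__ == "__main__":
    # Hypothetical path on a mounted working copy of the evidence.
    for name, stamp in file_timestamps("/mnt/evidence/My Documents/example.jpg").items():
        print(f"{name}: {stamp}")
```

Access times in particular deserve caution; many systems defer or disable access-time updates, which is one reason the examiner’s interpretation matters as much as the raw numbers.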

Collectors of child pornography usually have many pictures and videos — they number in the hundreds, thousands, and hundreds of thousands, and serious collectors categorize their collections into folders. It doesn’t take many cases before one can discern the serious offenders.


Unallocated Space
In addition to data in temporary storage and purposefully saved files, there is unallocated space on digital media. Unallocated space may be empty, or it may contain complete files, incomplete files or data fragments. Sometimes deleted data can be "carved" from unallocated space by forensic software: the software guesses what type of file the data once was and attempts to reconstitute it. Speculating as to the meaning of data in unallocated space is more alchemy than science or law. How the trial court in People v. Kent came to its conclusion that the defendant was guilty of possessing images located in unallocated space but not in temporary Internet storage is fact-specific, and it should not be applied to data found in unallocated space in general, because unallocated space is a much different animal than temporary Internet storage.
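Carving is conceptually simple: scan the raw bytes for known file signatures and cut out whatever lies between them. The bare-bones Python sketch below does this for JPEG data; the input path is hypothetical, and real carvers, including those built into the forensic suites, handle fragmentation, false positives and many more file types.

```python
from pathlib import Path

JPEG_HEADER = b"\xff\xd8\xff"  # JPEG start-of-image marker
JPEG_FOOTER = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(raw_image_path: str, output_dir: str = "carved") -> int:
    """Naively carve JPEG candidates out of a raw image of unallocated space."""
    data = Path(raw_image_path).read_bytes()  # fine for a sketch; real tools stream the data
    Path(output_dir).mkdir(exist_ok=True)
    count = 0
    start = data.find(JPEG_HEADER)
    while start != -1:
        end = data.find(JPEG_FOOTER, start + len(JPEG_HEADER))
        if end == -1:
            break
        candidate = data[start:end + len(JPEG_FOOTER)]
        Path(output_dir, f"carved_{count:05d}.jpg").write_bytes(candidate)
        count += 1
        start = data.find(JPEG_HEADER, end + len(JPEG_FOOTER))
    return count

if __name__ == "__main__":
    # Hypothetical raw export of unallocated space produced by a forensic tool.
    print(carve_jpegs("unallocated.dd"), "JPEG candidates carved")
```

Whether such a reconstructed fragment ever existed as a file the user saved, viewed or even knew about is exactly the question the software cannot answer.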

Still, that shouldn’t diminish the impact or import of the court’s holding regarding data held in temporary storage. There are several cases holding that data in unallocated space is not knowingly possessed for the same reasons the court held that files located in temporary Internet storage are not possessed in the Kent case — because the user did not know that the files were being saved and the user could not access the files without specialized software.

In the New York case, Justice Victoria Graffeo wrote in a concurring opinion that, according to the majority opinion in the case, “it is [now] legal in New York to knowingly access and view child pornography.” But it’s not easy to prove that someone viewed something. A person can accidentally access a web site and their computer will download hundreds of pictures or videos. While it is possible to prove that a web site was visited for a certain period of time, it isn’t possible to determine what pictures on that web site someone looked at, or even if the person was looking at the web site the whole time.

Still, police and prosecutors in Connecticut have pushed these cases in the past and continue to push them today, even as New York, other states and some federal jurisdictions abandon the practice.

This isn’t a matter of advocating for child pornographers or sex fiends. Everyone agrees that child pornography is odious. The child sex assault and exploitation that the pornography chronicles is, without question, an insult to our humanity and an unrelenting victimization of the minors depicted. No one would ever marginalize those souls or minimize their anguish. Yet, to effectively deal with this issue, we must recognize that there is more to it than the pictures themselves.

Child pornography cases should interest us all because they are at the cutting edge of electronic evidence cases. We will see the most salient legal issues tested in those cases first before the principles are applied to other areas of the law. People v. Kent demonstrates the necessity of analyzing the legal issues rather than focusing on the visceral recoil we experience at the offense. It is a good case to look at because the defendant was guilty in part and not guilty in part. The court held that some of the child pornography on his hard drive was possessed knowingly — the images in unallocated space — but the images stored in cache — in temporary Internet storage — were there without his knowledge and therefore not unlawful. •

Thanks to the Connecticut Law Tribune, where this appeared in the Monday May 14, 2012 issue at http://www.ctlawtribune.com/getarticle.aspx?id=42167 online.