PSN Down, Customer Info Compromised

Timeline of Sony Data Theft:

http://www.shacknews.com/article/68330/timeline-sony-data-theft

April 4 (Monday) - Hacker group Anonymous targets Sony with denial-of-service attacks, in retaliation for Sony's legal action against George Hotz (aka "GeoHot") and Graf_Chokolo

April 7 (Thursday) - Anonymous halts attacks, apologizes for inconveniencing users, and acknowledges diverse points of view within hacker group

April 17-19 (Sunday-Tuesday) - PlayStation Network and Sony Online Entertainment hacked, user data stolen

April 19 (Tuesday, 4:15 pm PDT) - Sony Network Entertainment America network team notices unauthorized activity due to unscheduled server reboots; team begins running logs to analyze data

April 20 (Wednesday, early afternoon) - Sony forms larger team to assist the investigation; network team discovers unauthorized intrusion and that unknown data had been transferred from the PlayStation Network; network team shuts down PSN; Sony retains a security and forensic consulting firm to assist in the investigation; Sony begins mirroring suspected servers

April 21 (Thursday) - Sony retains a second security and forensic consulting firm; Sony issues a statement suggesting the network could be down for "a day or two"

April 22 (Friday) - Sony Computer Entertainment America general counsel provides FBI with information about the intrusion; network team finishes mirroring 9 of the 10 suspected servers; Sony issues a statement admitting an "external intrusion"

April 23 (Saturday) - Network teams determine that sophisticated hackers deleted log files to hide activity within the network; Sony issues a statement regarding re-building the network infrastructure for better security

April 24 (Sunday) - Sony decides to retain a third forensic team to help determine the scope of the breach

April 25 (Monday) - Forensic teams are able to determine that user data had been stolen, but could not rule out whether credit card information had been accessed

April 26 (Tuesday) - Sony notifies public of data intrusion; Sony also notifies regulatory authorities in New Jersey, Maryland, and New Hampshire

April 27 (Wednesday) - Sony meets with FBI regarding data intrusion; Sony notifies the regulatory authorities in Hawaii, Louisiana, Maine, Massachusetts, Missouri, New York, North Carolina, South Carolina, and Puerto Rico; Sony tells SOE users that their databases and servers are kept separate, and therefore safe

April 28 (Thursday) - Hacker groups claim to be selling credit card data; security analysts confirm the discussions are taking place, but cannot confirm the legitimacy of the list; one hacker claims to have tried selling to Sony, but Sony denies any knowledge of such a sale

April 30 (Saturday) - Sony holds a press conference in Tokyo, apologizing for the data theft and detailing the PSN Welcome Back program; Sony says that some services will resume in the coming week

May 1 (Sunday, afternoon) - Sony detects intrusion into Sony Online Entertainment, including a file titled "Anonymous" that reads "We are Legion"

May 2 (Monday, morning) - Sony Online Entertainment servers taken offline, with a brief statement: "we have discovered an issue that warrants enough concern for us to take the service down effective immediately."

May 2 (Monday) - Sony receives Congressional inquiry; Sony issues a statement that 12,700 credit cards and 24.6 million accounts were compromised in the SOE data theft

May 4 (Wednesday) - Sony's Kaz Hirai responds to Congressional inquiry, implicating the hacker group Anonymous
 
I'm sorry to tell him, but just about every company runs obsolete software, mainly because server upgrades - sometimes even plain Windows updates on a server - are a pain in the ass. So a lot of companies end up in a state of "it works how we want it to, don't touch it, we don't want to break it," and they only upgrade when they have issues.

Not to mention that some software breaks when Microsoft or others "fix" their software, making patching a science in itself...
 
It can take weeks, months even, to roll out what would appear to be even the simplest patches or updates. It all depends on how large the network is and how much bespoke coding has been done to plug holes or provide extra levels of integration for the network's users. Add the fact that patches and updates are something of a black art, as was pointed out above, and you have what amounts to a security nightmare.

I don't think there are many networks out there that could survive the sustained and prolonged attention of a skilled hacker, especially ones as large and complex as Sony's. In many ways other companies are probably breathing a huge sigh of relief: 'There but for the grace of God go I...'.

It's also taken all the heat off the Epsilon hack, where they managed to lose almost the same subset of data as Sony did, but for these companies: Kroger, TiVo, US Bank, JPMorgan Chase, Capital One, Citi, Home Shopping Network, Ameriprise Financial, LL Bean Visa Card, McKinsey & Company, Ritz-Carlton Rewards, Marriott Rewards, New York & Company, Brookstone, Walgreens, The College Board, Disney Destinations, Best Buy, and Robert Half Technologies.

Now that's a leak!!
 
It's also taken all the heat off the Epsilon hack, where they managed to lose almost the same subset of data as Sony did, but for these companies: Kroger, TiVo, US Bank, JPMorgan Chase, Capital One, Citi, Home Shopping Network, Ameriprise Financial, LL Bean Visa Card, McKinsey & Company, Ritz-Carlton Rewards, Marriott Rewards, New York & Company, Brookstone, Walgreens, The College Board, Disney Destinations, Best Buy, and Robert Half Technologies.

Now that's a leak!!

Noooooooo, my Marriott Rewards.....
 
It can take weeks, months even, to roll out what would appear to be even the simplest patches or updates. It all depends on how large the network is and how much bespoke coding has been done to plug holes or provide extra levels of integration for the network's users. Add the fact that patches and updates are something of a black art, as was pointed out above, and you have what amounts to a security nightmare.

I don't think there are many networks out there that could survive the sustained and prolonged attention of a skilled hacker, especially ones as large and complex as Sony's. In many ways other companies are probably breathing a huge sigh of relief: 'There but for the grace of God go I...'.

It's also taken all the heat off the Epsilon hack, where they managed to lose almost the same subset of data as Sony did, but for these companies: Kroger, TiVo, US Bank, JPMorgan Chase, Capital One, Citi, Home Shopping Network, Ameriprise Financial, LL Bean Visa Card, McKinsey & Company, Ritz-Carlton Rewards, Marriott Rewards, New York & Company, Brookstone, Walgreens, The College Board, Disney Destinations, Best Buy, and Robert Half Technologies.

Now that's a leak!!

I received, at a minimum, 10 e-mails post-Epsilon breach. Some MUCH sooner than others... :???:
 
Not certain if this was already shared or not, but here are a few excerpts from: http://consumerist.com/2011/05/secu...re-was-obsolete-months-before-psn-breach.html

According to Spafford, security experts monitoring open Internet forums learned months ago that Sony was using outdated versions of the Apache Web server software, which "was unpatched and had no firewall installed." The issue was "reported in an open forum monitored by Sony employees" two to three months prior to the recent security breaches, said Spafford.

Spafford made his comments in a hearing convened by the House Subcommittee on Commerce, Manufacturing, and Trade. Sony was invited to participate in the hearing, but declined to attend.
 
At this point I just want the damn service back up. I bought an HRAP3 for MK and have been considering purchasing a second one, but if this outage is going to persist for weeks, then I might reconsider and just consolidate on the 360. 3D and exclusive characters were great selling points for the PS3 version, but no online play is a huge negative.
 
It can take weeks, months even, to roll out what would appear to be even the simplest patches or updates. It all depends on how large the network is and how much bespoke coding has been done to plug holes or provide extra levels of integration for the network's users. Add the fact that patches and updates are something of a black art, as was pointed out above, and you have what amounts to a security nightmare.

I don't think there are many networks out there that could survive the sustained and prolonged attention of a skilled hacker, especially ones as large and complex as Sony's. In many ways other companies are probably breathing a huge sigh of relief: 'There but for the grace of God go I...'.

That's why you have a test environment and a proper patch management policy in place.

Patch management processes are quite simple to follow.

1. Identify severity of patch
2. Implement in a test environment
3. Schedule a rollout based on severity

A severe vulnerability should be addressed quickly. An enterprise service like PSN will have multiple servers in a high-availability configuration, meaning patching can be done in production in a rolling manner once testing is completed.
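
To make the rolling-rollout idea concrete, here's a minimal sketch of what that can look like. Everything in it is a placeholder I've made up for illustration - the host names, the lb-drain/lb-undrain load-balancer hooks, the /healthz endpoint, and the package being patched - so it's a sketch of the general technique, not anything to do with Sony's actual setup.

```python
#!/usr/bin/env python3
"""Minimal sketch of a rolling patch rollout across a high-availability pool.

All names here (hosts, LB hooks, health endpoint, patched package) are
hypothetical placeholders for illustration only.
"""
import subprocess
import sys
import time
import urllib.request

HOSTS = ["app01.example.net", "app02.example.net", "app03.example.net"]
PATCH_CMD = "sudo yum -y update httpd"        # assumed patch to apply
HEALTH_URL = "http://{host}:8080/healthz"     # assumed health-check endpoint


def run(host, command):
    """Run a command on a remote host over ssh; raise if it fails."""
    subprocess.run(["ssh", host, command], check=True)


def healthy(host, retries=10, delay=5):
    """Poll the node's health endpoint until it answers 200 or we give up."""
    for _ in range(retries):
        try:
            with urllib.request.urlopen(HEALTH_URL.format(host=host), timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:
            pass
        time.sleep(delay)
    return False


def main():
    # Patch one node at a time so the rest of the pool keeps serving traffic.
    for host in HOSTS:
        run(host, "sudo /usr/local/bin/lb-drain")    # hypothetical: pull node from the LB
        run(host, PATCH_CMD)
        run(host, "sudo systemctl restart httpd")
        if not healthy(host):
            sys.exit(f"{host} failed its health check after patching; halting rollout")
        run(host, "sudo /usr/local/bin/lb-undrain")  # hypothetical: return node to the LB


if __name__ == "__main__":
    main()
```

The point is just that, with redundant nodes, the test pass and the production rollout don't have to take the service down or drag on for months.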

Outside of a breakdown in procedures and negligence, there is NO viable scenario in which the servers should be left unpatched for months. I'm sorry, but no one actually managing enterprise systems would agree that this is normal.

I understand taking a few days to a week to run through the QA process, but months...? Maybe the much-needed Chief Security Officer position they are creating will address this clear flaw.
 
@ban25
If you're in a hurry, in the meantime you can play online using XBSLink.

I have not tried it on PS3, but it says it supports PS3. (It was first made for Xbox, then added PS3 support.)
 
That's why you have a test environment and a proper patch management policy in place.

Patch management processes are quite simple to follow.

1. Identify severity of patch
2. Implement in a test environment
3. Schedule a rollout based on severity

A severe vulnerability should be addressed quickly. An enterprise service like PSN will have multiple servers in a high-availability configuration, meaning patching can be done in production in a rolling manner once testing is completed.

Outside of a breakdown in procedures and negligence, there is NO viable scenario in which the servers should be left unpatched for months. I'm sorry, but no one actually managing enterprise systems would agree that this is normal.

I understand taking a few days to a week to run through the QA process, but months...? Maybe the much-needed Chief Security Officer position they are creating will address this clear flaw.

As much as I am a big fan of Sony, you are completely correct. I manage a system where I work, and I know for a fact that the suppliers of said system have an identical test server for such updates, and I have a test server on site for testing new software - it may take months to plan, though... depending on the severity of the issue and the complexity of the 'fix'.

I do believe Sony were slow to react, but I also believe the dates provided above (I'm sure the FBI will catch them out if anything is untrue) - so Sony told us pretty much as soon as it was confirmed (within a day), not the 7-9 days many seem to suggest.

I wonder if Sony were working on a complete migration and this is why it was left unpatched for so long... it would also explain why they can 'all of a sudden' migrate to new 'more secure' servers, when usually such exercises take months of planning.
 
They were working on a complete migration to a new physical location, according to these details. We have been told they have expedited the transition to a new location as part of their system upgrade.
 
They were working on a complete migration to a new physical location, according to these details. We have been told they have expedited the transition to a new location as part of their system upgrade.

Yes, sorry - I was aware of this comment (I had actually suggested this elsewhere before Sony even said it)... I meant to say that maybe this confirms what they said. (Sorry, my memory is playing tricks on me!)
 
That's why you have a test environment and a proper patch management policy in place.

Patch management processes are quite simple to follow.

1. Identify severity of patch
2. Implement in a test environment
3. Schedule a rollout based on severity

A severe vulnerability should be addressed quickly. An enterprise service like PSN will have multiple servers in a high-availability configuration, meaning patching can be done in production in a rolling manner once testing is completed.

Outside of a breakdown in procedures and negligence, there is NO viable scenario in which the servers should be left unpatched for months. I'm sorry, but no one actually managing enterprise systems would agree that this is normal.

I understand taking a few days to a week to run through the QA process, but months...? Maybe the much-needed Chief Security Officer position they are creating will address this clear flaw.

Boy, that sounds real nice... until you step into an environment that doesn't have any of that in place, is in full production, is large, and is a conglomeration of inherited and legacy systems from past administrations, with very few of the people from those days still remaining (thankfully, and for good reason). Now, with that, try implementing proper test and prod environments when management doesn't see the cost justification (until it's too late) and you're limited on soft resources as well. In my experience, that's the more likely scenario than an environment with actual duplicate testing and production systems. Frankly, it's rather scary how many places have these crappy setups, considering their size, name/reputation, and the type of data they handle for people.

By the way, my rant is not an excuse for Sony. I'm just standing up for the IT guys who don't have the luxury of test/dev/prod through no fault of their own. "A breakdown in procedures and negligence" is simply the reality in many environments, and moving away from that is a monumental task. Not an excuse, just a reality for large corps that's far more common than your post would imply.
 
Boy, that sounds real nice... until you step into an environment that doesn't have any of that in place, is in full production, is large, and is a conglomeration of inherited and legacy systems from past administrations, with very few of the people from those days still remaining (thankfully, and for good reason). Now, with that, try implementing proper test and prod environments when management doesn't see the cost justification (until it's too late) and you're limited on soft resources as well. In my experience, that's the more likely scenario than an environment with actual duplicate testing and production systems. Frankly, it's rather scary how many places have these crappy setups, considering their size, name/reputation, and the type of data they handle for people.

By the way, my rant is not an excuse for Sony. I'm just standing up for the IT guys who don't have the luxury of test/dev/prod through no fault of their own. "A breakdown in procedures and negligence" is simply the reality in many environments, and moving away from that is a monumental task. Not an excuse, just a reality for large corps that's far more common than your post would imply.

But this is Sony, a massive company holding sensitive data... regardless of the complexities, there should be a test environment. How long has PSN been running now? I'm sorry, but they've had plenty of time to build a suitable and stable alternative and then flick a switch one evening.

This isn't a dig at anyone on the ground; upper management, who are 'penny pinching', are to blame.
 
Thanks for the link, I really liked this paragraph:

I thought SOE fell first and then PSN was hacked two days later, while the discovery of the hacks happened in the opposite order.

If so, why attack SOE first if the discovery of the vulnerability came from the PS3 hack?

It seems likely to me that the hackers found a vulnerability in SOE's security and then used something like bad password management (shared passwords across systems) on Sony's part to simply access PSN.
 
As much as I am a big fan of Sony, you are completely correct. I manage a system where I work, and I know for a fact that the suppliers of said system have an identical test server for such updates, and I have a test server on site for testing new software - it may take months to plan, though... depending on the severity of the issue and the complexity of the 'fix'.

I do believe Sony were slow to react, but I also believe the dates provided above (I'm sure the FBI will catch them out if anything is untrue) - so Sony told us pretty much as soon as it was confirmed (within a day), not the 7-9 days many seem to suggest.

I wonder if Sony were working on a complete migration and this is why it was left unpatched for so long... it would also explain why they can 'all of a sudden' migrate to new 'more secure' servers, when usually such exercises take months of planning.

But Sony's servers are not as easily accessible as the usual targets of the hacks that plague Windows operating systems; you have to have inside access to proprietary Sony software. These hackers got access to such things, as well as reverse engineering/hacking the Sony firmware, which they would never have been able to simply crack unless they had the official documentation.

I personally am very disappointed at how the mainstream tech media keeps making it sound like these hackers are intelligent, when they just had access to stuff the average consumer is not supposed to have access to, so it's no surprise; otherwise they would have hacked the PS3 way back in 2006 or early 2007, even if OtherOS was never offered.
 
http://blog.us.playstation.com/2011/05/05/important-step-for-service-restoration/

Today our global network and security teams at Sony Network Entertainment and Sony Computer Entertainment began the final stages of internal testing of the new system, an important step towards restoring PlayStation Network and Qriocity services.

http://blog.us.playstation.com/2011...ction-in-the-united-states-through-debix-inc/

A $1 million identity theft insurance policy per user

http://blog.us.playstation.com/2011/05/05/a-letter-from-howard-stringer/
 
One year of identity theft insurance is pretty common in these situations. I wonder what Sony actually pays when they sign a contract for so many customers? I bet it isn't very much per customer.
 
It's probably not even that they just signed such a contract. It's probably covered under some insurance they bought against this type of thing.
 
But Sony's servers are not as easily accessible as the usual targets of the hacks that plague Windows operating systems; you have to have inside access to proprietary Sony software. These hackers got access to such things, as well as reverse engineering/hacking the Sony firmware, which they would never have been able to simply crack unless they had the official documentation.

I personally am very disappointed at how the mainstream tech media keeps making it sound like these hackers are intelligent, when they just had access to stuff the average consumer is not supposed to have access to, so it's no surprise; otherwise they would have hacked the PS3 way back in 2006 or early 2007, even if OtherOS was never offered.

Sorry, I wasn't aware of the insider info - it was implied somewhere IIRC but I don't recall it being confirmed?
 