I’m sure by now you’ve heard about the University of Rochester study connecting fast-paced video games to faster decision-making—if not from the Twitterverse, then from your teenagers who are trying to convince you that Halo Reach is going to help them on their SATs. So, is it true? Should you make sure your kids are getting in their 50 hours a week of Left 4 Dead 2?
Well, while there’s plenty of documentation about video game skills translating to improvements in certain real-life skills (surgeons performing laparoscopic surgery, for instance), what I’ve read about the study makes me pause a little.
From what I could understand of the study, the researchers found a group of 18- to 25-year-old men and women who don’t normally play video games, then had half of them play 50 hours of Call of Duty 2 and Unreal Tournament while the other half played Sims 2. Afterward, both groups were asked to perform tasks that were largely (if not entirely) computer-based. The action game players were up to 25 percent faster than, and just as accurate as, those who played Sims 2.
But this raises a few questions. Does playing video games actually help with non-computer-related decision-making? I mean, I have trouble deciding what to make my family for dinner each evening. Firing up my Xbox probably isn’t going to help me with that. (Or maybe it will: “Hey, we’re all having ramen for supper! Again!”) Also, how do we know that playing Sims 2 isn’t detrimental to decision-making? It’s unclear to me whether the participants were tested before and after their 50-hour gaming sessions and, if so, how much their decision-making abilities improved or declined from their own baselines. Without a non-gaming control group, it’s hard to know how gamers compare to non-gamers.
And of course, the biggest question I’m sure we’re all asking: How did they manage to find 18- to 25-year-olds who don’t play video games?
Image: theogeo on Flickr, used under a Creative Commons license.