posted by mrcoolbp on Wednesday April 08 2015, @05:25PM
from the simon-says-don't-overthink-it dept.

MedicalXpress is reporting on new research into how our neural systems learn new skills. Led by UC Santa Barbara's Scott Grafton, researchers at UCSB, the University of Pennsylvania and Johns Hopkins University sought to answer the question: "Why are some people able to master a new skill quickly while others require extra time or practice?"

Researchers used functional magnetic resonance imaging (fMRI) to identify the regions of the brain involved in learning and new skill acquisition while subjects played a simple game. Rather than focusing on specific areas of the brain for short periods of time, the researchers took a more holistic approach, examining the process of learning a more complex skill over a longer period.

Some of the results were surprising: using more of your brain won't help you learn more quickly; instead, as "counterintuitive as it may seem, the participants who showed decreased neural activity learned the fastest."

From the article:

The researchers discovered that the neural activity in the quickest learners was different from that of the slowest. Their analysis provides new insight into what happens in the brain during the learning process and sheds light on the role of interactions between different regions. The findings, which appear online today in Nature Neuroscience, suggest that recruiting unnecessary parts of the brain for a given task—similar to overthinking the problem—plays a critical role in this important difference.

At UCSB's Brain Imaging Center, study participants played a simple game while their brains were scanned with fMRI. The technique measures neural activity by tracking the flow of blood in the brain, highlighting which regions are involved in a given task.

Participants responded to a sequence of color-coded notes by pressing the corresponding button on a hand-held controller.

The study continued with participants practicing at home while researchers monitored their activity remotely. Subjects returned to the Brain Imaging Center at two-, four- and six-week intervals for new scans that demonstrated how well practice had helped them master the skill. Completion time for all participants dropped over the course of the study but did so at different rates. Some picked up the sequences immediately, while others gradually improved over the six-week period.

"Previous brain imaging research has mostly looked at skill learning over—at most—a few days of practice, which is silly," said Grafton, who is also a member of UCSB's Institute for Collaborative Biotechnologies. "Who ever learned to play the violin in an afternoon? By studying the effects of dedicated practice over many weeks, we gain insight into never before observed changes in the brain. These reveal fundamental insights into skill learning that are akin to the kinds of learning we must achieve in the real world."

 
  • (Score: 2) by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Thursday April 09 2015, @04:37AM (#168171) Homepage Journal

    I myself graduated high school in 1982 - Armijo High School in Fairfield, California.

    When I was in eighth grade, our Algebra I book discussed the PL/I programming language extensively. I was hoping we'd get to learn computer programming, but no - we only studied algebra.

    At my high school, they had a "Programmable Calculator" class, in which the calculators were the size of a regular typewriter. They did have magnetic card storage, so one could save one's source code.

    The entire school district shared just one DECSYSTEM 20. Solano Community College had just one DECSYSTEM 10.

    The school district's DECSYSTEM was only for administrative use - the students were not normally permitted to use it. However, I was able to argue that I should be allowed to, as I had - strictly speaking - already graduated high school by passing the California High School Proficiency Examination, then got As in BASIC, FORTRAN and COBOL at Solano Community College.

    For the school district's DEC-20 to be made available to any significant number of students would have brought it to its knees. When I was at Solano in the Fall of 1980, I would submit a FORTRAN or COBOL Hollerith deck late in the evening, then retrieve my deck, as well as the ALL UPPERCASE LINEPRINTER HARDCOPY, early the following evening. To be clear: a single Edit/Compile/Test cycle took roughly 20 hours.

    That wasn't working well for anybody - especially because SCC also used its DEC-10 for payroll, student grades and student class schedules - so it invested in a Prime minicomputer for the computer programming classes, as well as a couple dozen glass TTYs. However, a friend who used the Prime a year or two later told me that with so many users it was dog-slow.

    When I was at Caltech, almost all of the VAXen ran VMS. There were a very few BSD UNIX VAXen, mostly in the Physics department. I'm not sure, but I think the CS department may have had a BSD VAX; it wasn't available for casual users, though. I myself had an account on TIMEVAX, where I attempted - but failed - to write my very first Conway's Game of Life [warplife.com] implementation in FORTRAN 77 with VT100 ASCII graphics.
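
    For reference, the Life update rule itself is tiny. A minimal sketch in modern Python (standard B3/S23 rules; purely illustrative, nothing to do with the FORTRAN 77/VT100 version described above):

    from collections import Counter

    def life_step(live):
        """One generation of Conway's Game of Life (B3/S23).

        `live` is a set of (x, y) coordinates of living cells.
        """
        # Count live neighbors for every cell adjacent to a live cell.
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A cell lives next step if it has 3 neighbors, or 2 and was already alive.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    # A glider, advanced one generation:
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    print(sorted(life_step(glider)))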

    We had two VAX 11/780s in the Astronomy department, where I worked as a research assistant; my second summer there they bought an 11/750 for $150,000.00, as well as a VAXstation.

    There was a huge, long, loud, fierce debate as to whether the VAXstation was worth all that money, as opposed to getting another 11/780.

    We also had two Grinnell image processing units. They were capable of 512x512 pixels with a depth of 16 bits; however, they could only actually _display_ eight of those bits! That is, one could adjust the Grinnell so that only bright stars were displayed, or one could show dim stars while the bright stars looked like overexposed photographic prints. I've never seen that effect on any other system; it was quite cool.
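
    What the Grinnell was doing amounts to choosing which 8 contiguous bits of each 16-bit pixel to send to the screen, with everything above the window saturating. A sketch of that effect in Python/NumPy (the hardware behavior is an assumption on my part, and the data is made up - this is not the Grinnell's actual interface):

    import numpy as np

    def display_window(image16, shift):
        """Map 16-bit pixels to 8 displayable bits.

        shift=8 keeps only the high bits (dim stars vanish, bright stars
        show); shift=0 keeps the low bits, with bright stars clipping to
        white like an overexposed print. Hardware behavior is assumed.
        """
        return np.clip(image16.astype(np.int32) >> shift, 0, 255).astype(np.uint8)

    rng = np.random.default_rng(0)
    frame = rng.integers(0, 2**16, size=(512, 512), dtype=np.uint16)  # fake star field
    dim_view = display_window(frame, 0)     # dim stars resolved, bright ones blown out
    bright_view = display_window(frame, 8)  # only bright stars visible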

    The Space Telescope Wide Field and Planetary Camera was commonly called the "Four-Shooter" because it had four 1024x1024 CCDs with - IIRC - 16 bits per pixel. NASA had 25 of the CCDs made at a cost of $25,000,000.00, then selected the very best 4 of the 25 for the Space Telescope. The next 4, modestly inferior, went into a largely identical four-shooter for use on the 200" Hale Telescope at Palomar Mountain Observatory.

    During my time in the Astro department, a debate arose over whether we should run UNIX on the Astro VAXes; that would have enabled us to get on the Internet. While the UNIX source code was effectively free of charge, it would have come at the cost of a colossally expensive router. We also would have had to port all of our data analysis code over to UNIX.

    I don't really know, but it is quite likely that a FORTRAN compiler was simply unavailable for UNIX back then. None of the astronomers, and very few of the students, had the first clue about C. I myself learned C during my sophomore year, but it is quite likely that I was the only person in the entire Astronomy department who knew C - and I wasn't very good at it.

    Thus the Astronomy department chose not to run UNIX.

    A few months after I left Caltech, in the Spring of 1985 or so, I interviewed with a self-employed arcade game developer who used a VAX 11/750 with a cross-assembler to write games. He was interested in hiring me to help him write his own suite of cross-development tools, for use in applications other than games, because, as he said, "The kids aren't plunking in the quarters like they used to." He was interested in me because I knew quite a lot about software - but it was Computational Physics in FORTRAN. I didn't have much of a clue about compilers; he was the very first to tell me what a hash table was!
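
    For anyone in the same boat I was in then: a hash table buckets keys by a hash function so lookups don't have to scan everything - exactly what an assembler or compiler wants for its symbol table. A toy sketch in Python (illustrative only; real tools use far better hash functions):

    class SymbolTable:
        """Toy chained hash table, like an assembler's symbol table."""

        def __init__(self, n_buckets=64):
            self.buckets = [[] for _ in range(n_buckets)]

        def _bucket(self, name):
            # Sum-of-bytes is a deliberately simple (weak) hash.
            return self.buckets[sum(name.encode()) % len(self.buckets)]

        def define(self, name, value):
            bucket = self._bucket(name)
            for i, (k, _) in enumerate(bucket):
                if k == name:          # redefinition overwrites
                    bucket[i] = (name, value)
                    return
            bucket.append((name, value))

        def lookup(self, name):
            for k, v in self._bucket(name):
                if k == name:
                    return v
            raise KeyError(name)

    symbols = SymbolTable()
    symbols.define("START", 0x0800)
    print(hex(symbols.lookup("START")))  # 0x800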

    This same fellow asked me if I'd heard about that new chip from Intel - the 80386 - "They say that one microprocessor is about as fast as my VAX 11/780!"

    That was indeed the case: a Compaq 386 with 4 MB of memory, running MS-DOS with a memory extender, was the rough equivalent of a VAX 11/780. The VAX cost $150,000.00, while the Compaq would only set you back $30,000.00.

    I used Scott's MS-DOS typing game to learn touch typing in 1988. At the time, computer programmers had no real need to know how to touch type. The reason I felt the need to was that I was concerned that not knowing how would lead future employers to think I didn't really know how to program - when in fact what I really didn't know was how to type.

    When my father and I first learned to code in FORTRAN, we used pencils and coding forms - sheets resembling graph paper, eighty columns across. I don't know for sure, but I would be completely unsurprised if they were 24 lines high!

    Back then, to use a pencil and a coding form was really the only way one _could_ write production software in a commercial environment, as keypunch machines cost ten grand, and skilled, experienced keypunch operators were highly paid, as well as hard to find.

    The difference between me and you is only four years!

    --
    Yes I Have No Bananas. [gofundme.com]
  • (Score: 2) by davester666 (155) on Thursday April 09 2015, @04:47AM (#168178)

    Not that different. I would write out pages and pages of AppleBasic, assembler and machine code [manually converting assembler to machine code] at home, to type into the Apple computer at school, which I only had access to for a couple of hours a week.
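
    Hand-assembly really is just table lookup and bookkeeping. A toy sketch of the process in Python, using a few 6502 opcodes (real assemblers also handle labels, branches, and the rest of the addressing modes):

    # Toy hand-assembler for a few 6502 instructions: look up each
    # mnemonic's opcode byte, then emit operands low byte first.
    OPCODES = {
        ("LDA", "imm"): 0xA9,  # LDA #value
        ("STA", "abs"): 0x8D,  # STA address
        ("RTS", None):  0x60,
    }

    def assemble(program):
        out = bytearray()
        for mnemonic, mode, operand in program:
            out.append(OPCODES[(mnemonic, mode)])
            if mode == "imm":
                out.append(operand)                    # one-byte immediate
            elif mode == "abs":
                out += operand.to_bytes(2, "little")   # little-endian address
        return bytes(out)

    # LDA #$C1 / STA $0400 / RTS: writes an 'A' to the top-left of the
    # Apple II text screen (text page 1 starts at $0400).
    code = assemble([("LDA", "imm", 0xC1), ("STA", "abs", 0x0400), ("RTS", None, None)])
    print(code.hex(" "))  # a9 c1 8d 00 04 60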