
posted by LaminatorX on Wednesday October 15 2014, @03:06PM   Printer-friendly
from the considered-harmful dept.

The New York Times has coverage of the phenomenon of Developer Bootcamps, which claim to do in a couple of months what used to take at least two years for an associate's degree. These cram courses are apparently achieving about a 75% job placement rate.

Have any Soylentils either gone through these programs, or worked with others who have? If so, what are your experiences?

 
  • (Score: 1) by linuxrocks123 on Wednesday October 15 2014, @07:03PM

    by linuxrocks123 (2557) on Wednesday October 15 2014, @07:03PM (#106358) Journal

    I totally and completely agree with you in general, but I'd say the reason for teaching sorting is general algorithms rather than recursion specifically. The only recursive sort I know off the top of my head is mergesort.
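
    For reference, a minimal sketch of the recursive mergesort mentioned above, in Python (names and the non-in-place style are my choices, not anything from the thread):

    ```python
    # Recursive merge sort: split in half, sort each half, merge the results.
    def merge_sort(xs):
        if len(xs) <= 1:                 # base case: 0 or 1 elements are sorted
            return xs
        mid = len(xs) // 2
        left = merge_sort(xs[:mid])      # recurse on each half
        right = merge_sort(xs[mid:])
        # merge the two sorted halves into one sorted list
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        out.extend(left[i:])             # one of these is already empty
        out.extend(right[j:])
        return out
    ```

    The recursion depth is O(log n) and every level does O(n) merge work, which is where the O(n log n) bound comes from.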

  • (Score: 2) by No.Limit on Wednesday October 15 2014, @07:28PM

    by No.Limit (1965) on Wednesday October 15 2014, @07:28PM (#106368)

    heapsort (for the heap structure)
    quicksort
    radix sort

    but then you probably wouldn't use recursion for bubble sort, insertion sort, or selection sort (of course you're always free to replace loops with recursion, but unless you're coding in a functional language that would be quite forced).

    Though I think sorting isn't really taught for recursion, but rather for complexity theory, and also because it's one of the most important problems to solve. Being able to decide which sorting algorithm fits which situation is very useful in practice.

    I believe some C/C++ libraries use quicksort for their sorting, which is worst case O(n^2), way worse than e.g. merge sort's worst case of O(n*log(n)). But quicksort is still really good because it's in-situ and O(n*log(n)) on average.

    Recursion can be taught very well with tree structures (binary trees, balanced trees, abstract syntax trees etc) or simply with functional languages.
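
    As a sketch of that last point, here's recursion over a binary tree in Python (the Node class and function names are illustrative, not from the thread). Each function just trusts the recursive call to handle the subtrees:

    ```python
    # A plain binary tree node; left/right are child Nodes or None.
    class Node:
        def __init__(self, value, left=None, right=None):
            self.value, self.left, self.right = value, left, right

    def tree_sum(node):
        if node is None:                 # empty subtree contributes nothing
            return 0
        return node.value + tree_sum(node.left) + tree_sum(node.right)

    def tree_height(node):
        if node is None:                 # empty subtree has height 0
            return 0
        return 1 + max(tree_height(node.left), tree_height(node.right))
    ```

    Unlike a loop rewritten as recursion, there is no natural iterative version here without managing an explicit stack, which is why trees make the technique feel necessary rather than forced.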

    • (Score: 1) by ld a, b on Wednesday October 15 2014, @09:58PM

      by ld a, b (2414) on Wednesday October 15 2014, @09:58PM (#106426)

      Complexity theory is undermined by sorting algorithms.
      In the beginning we had an O(n) algorithm; then we found a way of sorting in O(n log n) time; finally, we all settled on an O(n^2) algorithm because it wipes the floor with everything else on average.
      Sounds convincing.

      --
      10 little-endian boys went out to dine, a big-endian carp ate one, and then there were -246.
      • (Score: 2) by maxwell demon on Thursday October 16 2014, @07:00AM

        by maxwell demon (1608) Subscriber Badge on Thursday October 16 2014, @07:00AM (#106549) Journal

        The current state of the art is Introsort. Almost the same speed as Quicksort, but worst-case complexity O(n log n).

        Anyway, Quicksort teaches an important lesson about complexity theory as well: Worst case complexity is not the only consideration you should take into account.
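
        A rough sketch of the introsort idea in Python (my own simplified version with a Lomuto partition; real implementations like libstdc++'s also switch to insertion sort for tiny ranges): run quicksort, but cap the recursion depth at ~2*log2(n), and fall back to heapsort for any range that exceeds the cap, which keeps the worst case at O(n log n):

        ```python
        import heapq

        def introsort(xs):
            """Quicksort with a depth cap; past the cap, heapsort the range."""
            xs = list(xs)
            limit = 2 * len(xs).bit_length()          # depth cap ~ 2*log2(n)

            def heapsort(lo, hi):
                part = xs[lo:hi]
                heapq.heapify(part)
                xs[lo:hi] = [heapq.heappop(part) for _ in range(len(part))]

            def sort(lo, hi, depth):
                while hi - lo > 1:
                    if depth <= 0:                    # pathological input: bail out
                        heapsort(lo, hi)
                        return
                    depth -= 1
                    mid = (lo + hi) // 2              # middle pivot, moved to end
                    xs[mid], xs[hi - 1] = xs[hi - 1], xs[mid]
                    pivot = xs[hi - 1]
                    i = lo                            # Lomuto partition
                    for j in range(lo, hi - 1):
                        if xs[j] < pivot:
                            xs[i], xs[j] = xs[j], xs[i]
                            i += 1
                    xs[i], xs[hi - 1] = xs[hi - 1], xs[i]
                    # recurse on the smaller side, loop on the larger
                    if i - lo < hi - i - 1:
                        sort(lo, i, depth)
                        lo = i + 1
                    else:
                        sort(i + 1, hi, depth)
                        hi = i

            sort(0, len(xs), limit)
            return xs
        ```

        On typical input the depth cap is never hit, so you get quicksort's speed; the heapsort fallback only fires on inputs that would otherwise trigger the O(n^2) behavior.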

        Oh, and what is the O(n) algorithm you are speaking about?

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 1) by ld a, b on Thursday October 16 2014, @11:34AM

          by ld a, b (2414) on Thursday October 16 2014, @11:34AM (#106577)

          Sorry to disappoint you if you were hoping for something else, but the answer is radix sort, which was developed to sort US census data back when tabulating machines were the bleeding edge.
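
          A minimal Python sketch of the LSD (least-significant-digit-first) flavor, working a byte at a time (the function name and the fixed 32-bit key width are my assumptions; the tabulating machines worked on decimal digits instead):

          ```python
          def radix_sort(xs, bits=32, chunk=8):
              """LSD radix sort for non-negative ints below 2**bits.

              Makes bits/chunk stable bucketing passes, each O(n),
              so the total is O(n * bits): linear for fixed-width keys."""
              mask = (1 << chunk) - 1
              for shift in range(0, bits, chunk):
                  buckets = [[] for _ in range(1 << chunk)]
                  for x in xs:
                      buckets[(x >> shift) & mask].append(x)
                  # stability matters: later passes must not reorder
                  # keys that agree on the current digit
                  xs = [x for bucket in buckets for x in bucket]
              return xs
          ```

          Note there isn't a single comparison between elements anywhere, which is how it sidesteps the O(n log n) lower bound discussed below.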

          --
          10 little-endian boys went out to dine, a big-endian carp ate one, and then there were -246.
          • (Score: 2) by No.Limit on Thursday October 16 2014, @07:13PM

            by No.Limit (1965) on Thursday October 16 2014, @07:13PM (#106768)

            Radix sort's O(n) runtime is quite controversial, as it's actually O(n*log(k)), where k is the number of possible key values and all keys must have a fixed-size binary representation.

            More on this here [stackoverflow.com]

            For comparison-only sorting algorithms (a much more restrictive model), O(n*log(n)) is proven to be the best possible worst-case running time.
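
            The proof is the decision-tree argument: a comparison sort must distinguish all n! input orderings, and a binary tree with n! leaves has depth at least log2(n!), which grows like n*log2(n) by Stirling's approximation. A quick numeric sanity check in Python (my own illustration):

            ```python
            import math

            # Worst-case comparisons needed: at least log2(n!) to tell
            # apart all n! possible input orderings.
            def comparison_lower_bound(n):
                return math.log2(math.factorial(n))

            for n in (10, 100, 1000):
                # the bound tracks n*log2(n) ever more closely as n grows
                print(n, round(comparison_lower_bound(n)),
                      round(n * math.log2(n)))
            ```

            For n = 10 that's about 22 comparisons versus n*log2(n) ≈ 33; the ratio approaches 1 as n grows.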