The average American watches more than five hours of TV per day, but pretty soon that leisure time may be dominated by YouTube and other online video services.
In an address at CES 2016, YouTube's chief business officer Robert Kyncl argued that digital video will be the single biggest way that Americans spend their free time by 2020 – more than watching TV, listening to music, playing video games, or reading.
The amount of time people spend watching TV each day has been pretty steady for a few years now, Mr. Kyncl pointed out, while time spent watching online videos has grown by more than 50 percent each year. Data from media research firm Nielsen shows that it's not just young people watching online videos, either: adults aged 35 to 49 spent 80 percent more time on video sites in 2014 than in 2013, and adults aged 50 to 64 spent 60 percent more time on video sites over the same time period.
Why the shift?
(Score: 2, Funny) by Anonymous Coward on Wednesday January 13 2016, @11:32AM
Haven't looked at a TV since 2007. I liked analog TV better than digital. Digital was shit, so I said fuck TV.
Switched to streaming video on the Internet, and never looked back.
Now I have one of those mobile broadband hotspots, and wouldn't you know it receives digital Internet on what used to be analog TV channel 52.
Came full circle I have.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @09:11PM
I'd still like to watch certain sports, but they are rarely offered over the Internet, at least not without expensive all-or-nothing season-long bundling. I HATE FORCED BUNDLING!
Don't these big franchises worry about losing young viewers? There are alternative sports that don't charge an arm and a leg (and a concussion) to view, and these will swipe sports fans if the old-school franchises don't prepare.
(Score: 1) by Shimitar on Wednesday January 13 2016, @11:53AM
Well... i am not in the USA and i don't own a TV (never have), so i guess my opinion is a bit external to all this.
I think, simply, YouTube and the like are interactive. You watch what you like, when you like it. If TV were like this, maybe it might survive. Otherwise, nothing beats freedom, right? Choose what to watch when you want. Free.
Coding is an art. No, java is not coding. Yes, i am biased, i know, sorry if this bothers you.
(Score: 2, Touché) by Anonymous Coward on Wednesday January 13 2016, @12:11PM
Coding is an art. No, java is not coding.
Programming is a science. Coders are not programmers. Don't forget to overdose on cocaine during your next sprint for that deadline, code monkey.
(Score: 1) by Shimitar on Wednesday January 13 2016, @12:20PM
Interesting point...
care to describe your definition of coding and programming?
And i have no deadlines, except those set by my wife.
(Score: 1, Interesting) by Anonymous Coward on Wednesday January 13 2016, @12:51PM
Different AC here.
Programming you can do without having a computer. Indeed, you don't even need to know a specific programming language. Try coding without a computer and a programming language.
Programming is about designing data structures and developing algorithms. Coding is about implementing specific tasks in code.
In reality, no one is a pure programmer or a pure coder, but that doesn't mean there's no difference.
(Score: 1) by Shimitar on Wednesday January 13 2016, @01:09PM
I was curious to know why programming should be a science....
(Score: 4, Insightful) by VLM on Wednesday January 13 2016, @01:24PM
Cut him some slack, he leads off with
i am not in the USA
My guess is where ever he is, they understand vocational training vs education better than here, because here we have massive propaganda to confuse the issue.
Coding is vocational computer science, kinda like brick laying is vocational civil engineering. Most of the people doing it were trained to act without thinking very hard about it. Coders only work when they're typing. Typical book title would be "Learn Java in 24 hours". Smart coders learn new technology quickly and hold entire blueprints in their head and are good at debugging. Usually coders produce code, hence the name.
Software engineering / computer scientist work is the education side of computer science, kinda like civil engineering is the educational side of bricklaying. Lots of theory and math and scalability and design and mostly thinking. Computer scientists only work when they're quietly thinking, or maybe debating with colleagues or reading journal papers. Typical book title would be some automata theory textbook, or a Knuth book (any of them, really). Smart computer scientists invent or refine observations, model things with giant math equations and proofs, invent the languages the coders eventually learn. Usually computer scientists produce journal papers, conference presentations, and enormous amounts of hot air, hence the academic "scientist" part of the name.
There's tons of crossover: plenty of computer scientists can code up a storm, and plenty of coders are also computer scientists due to external economic reasons (massive underemployment, an industrial-educational complex that produces new XYZ based on ability to pay rather than market demand/jobs, etc.)
(Score: 2, Interesting) by Shimitar on Wednesday January 13 2016, @01:45PM
The fact that i got a CS education all the way doesn't take away the fact it was a consequence of my passion and interests. Never got into it for the job; exactly the other way round. Guess i am lucky, since my job is also my passion, and guess the first reply came from somebody a bit less happy with his/her life.
And, by the way, you are perfectly right. I think of university education as a way to improve your overall CULTURE in a specific direction, not a way to get a job or get into a profession.
I don't define my life from my job, nor what i am from my profession.
(Score: 2) by VLM on Wednesday January 13 2016, @02:02PM
I don't define my life from my job, nor what i am from my profession.
LOL yeah we already heard :
i am not in the USA
Note that almost everyone in the USA believes in the opposite of your statement. I more or less agree with you, but there's just one of me, for better or worse. The opposite of what you wrote is by far one of our dominant cultural beliefs.
(Score: 2, Interesting) by Shimitar on Wednesday January 13 2016, @02:10PM
I lived in the USA for a bit and while what you say is indeed true, at the same time i also met people who are not like that.
Usually the most interesting ones...
Surely the only ones with whom i am still in contact today.
Don't be too hard on your fellow countrymen, i could start bitching about mine too. Luckily, good people are evenly distributed. Unluckily, stupid people are too.
(Score: 2) by bzipitidoo on Wednesday January 13 2016, @02:09PM
Textbook programming also simplifies by ignoring "irrelevancies". Typical algorithm textbook pseudocode has no worries about types, overflow, array sizes and bounds, heap space, or I/O. The Turing Machine's tape is infinite. And I have found that can lead to errors. For instance, Quicksort is always stated to take time O(n log n) (on average; worst case is O(n^2)), but that makes a big assumption, which is that a single comparison can be done in O(1). Yet string comparison is well known to take O(n) time. How can Quicksort be done in O(n log n) time on strings?
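To make the hidden assumption concrete: a lexicographic compare walks the two strings until they first differ, so its cost grows with the length of the shared prefix. A minimal Python sketch (the helper name is mine):

```python
def char_comparisons(s, t):
    """Count character positions a naive lexicographic compare examines."""
    examined = 0
    for a, b in zip(s, t):
        examined += 1
        if a != b:
            return examined
    # one string is a prefix of the other, or they are equal
    return examined

# Two strings sharing a 1000-character prefix: the compare costs
# O(prefix length), not O(1)
print(char_comparisons("a" * 1000 + "x", "a" * 1000 + "y"))  # 1001
```

So each comparison inside the sort can cost up to O(m) for strings of length m, which is where the question comes from.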
(Score: 2) by curunir_wolf on Wednesday January 13 2016, @02:18PM
I am a crackpot
(Score: 1) by Shimitar on Wednesday January 13 2016, @02:21PM
Ssshh... it's a Javascript "scientist" :)
(Score: 2) by tibman on Wednesday January 13 2016, @03:07PM
For those who want to know why: a string is an array of char (a character). Char is a datatype because it has a fixed size in memory, just like int and float and the other primitives.
SN won't survive on lurkers alone. Write comments.
(Score: 2) by Pino P on Wednesday January 13 2016, @06:12PM
char (a character)
The data type char does not fully represent a character. In C, it represents a UTF-8 code unit; in Java, it represents a UTF-16 code unit. However, a character won't fit in either of those.
How many characters is é (Latin small letter E with acute)? What about y̾ (Latin small letter Y with combining vertical tilde)? Or 加 (CJK ideogram meaning "add") ? Or 💩 (pile of poo)? Or 💩̾ (pile of poo with combining steam)?
Each of these five is one "grapheme cluster", though all five are more than one UTF-8 code unit, three are more than one UTF-16 code unit, and two are more than one code point (so UTF-32 won't help). See the UTF-8 Everywhere manifesto [utf8everywhere.org] and why Swift's string API is so messed up [mikeash.com] to learn how "character" isn't a data type either.
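Python makes these counts easy to check, since len() counts code points and encode() yields code units; a sketch of the five examples above, written as escapes (the helper name is mine):

```python
def unit_counts(s):
    """(code points, UTF-8 code units, UTF-16 code units) for a string."""
    return len(s), len(s.encode("utf-8")), len(s.encode("utf-16-le")) // 2

examples = [
    ("e-acute (precomposed)",          "\u00e9"),
    ("y + combining vertical tilde",   "y\u033e"),
    ("CJK ideograph 'add'",            "\u52a0"),
    ("pile of poo",                    "\U0001f4a9"),
    ("poo + combining vertical tilde", "\U0001f4a9\u033e"),
]
for name, s in examples:
    print(name, unit_counts(s))
# Every example takes more than one UTF-8 unit; three take more than one
# UTF-16 unit; the two with the combining mark span more than one code point.
```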
(Score: 2) by tibman on Wednesday January 13 2016, @07:06PM
A char is typically one byte. Historically, char is short for character and represents an ASCII character. Only higher level stuff cares about what a collection of chars represents. UTF-8 is a multi-byte encoding, so by definition you cannot pre-allocate space for a UTF-8 character unless you already know what it is. You also cannot know how many bytes a UTF-8 character is unless you decode it. One UTF-8 character may be one byte or it may be four. So there will never be a primitive datatype for UTF-8. When you are talking about UTF-8 then you might as well be talking about strings or some other dynamic structure. I think only UTF-32 could be made into a primitive, and a UTF-32 char (4 bytes, fixed) would indeed fully represent any character.
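One nuance: decoding just the leading byte is enough to tell how long a well-formed UTF-8 sequence is, even though you still can't pre-allocate a fixed-width slot for it. A minimal Python sketch (the helper name is mine):

```python
def utf8_seq_len(lead: int) -> int:
    """Length in bytes of a UTF-8 sequence, read off its leading byte."""
    if lead < 0x80:            # 0xxxxxxx: ASCII, one byte
        return 1
    if lead >> 5 == 0b110:     # 110xxxxx: two-byte sequence
        return 2
    if lead >> 4 == 0b1110:    # 1110xxxx: three-byte sequence
        return 3
    if lead >> 3 == 0b11110:   # 11110xxx: four-byte sequence
        return 4
    raise ValueError("continuation or invalid leading byte")

for ch in ("a", "\u00e9", "\u52a0", "\U0001f4a9"):
    b = ch.encode("utf-8")
    assert utf8_seq_len(b[0]) == len(b)
```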
(Score: 2) by Pino P on Wednesday January 13 2016, @07:33PM
Each of these five is one "grapheme cluster", though ... two are more than one code point (so UTF-32 won't help).
a UTF-32 char (4 bytes, fixed) would indeed fully represent any character
A UTF-32 code unit does indeed represent any code point. But because not all characters of a script are available precomposed [wikipedia.org], a single grapheme cluster may span more than one code point if it has combining diacritics attached to it. Nor is it very useful to divide a string between the code points that make up a grapheme cluster. That's what I meant to get across by including the examples of y̾ (y with vertical tilde) and 💩̾ (poo with steam): there is no fixed-width data type that can represent all characters.
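A rough illustration of why clusters span code points: with only the standard library you can approximate the cluster count by treating every code point with canonical combining class 0 as the start of a cluster. This is only an approximation; real segmentation is defined by UAX #29 and needs a proper library. Sketch (the helper name is mine):

```python
import unicodedata

def rough_grapheme_count(s: str) -> int:
    """Approximate grapheme-cluster count: a combining mark (nonzero
    combining class) attaches to the preceding base character.
    Full segmentation is defined by UAX #29; this is a rough cut."""
    return sum(1 for ch in s if unicodedata.combining(ch) == 0)

print(rough_grapheme_count("y\u033e"))           # 1: y + combining vertical tilde
print(rough_grapheme_count("\U0001f4a9\u033e"))  # 1: poo + combining vertical tilde
print(rough_grapheme_count("caf\u00e9"))         # 4: precomposed e-acute is one cluster
```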
(Score: 3, Insightful) by tibman on Wednesday January 13 2016, @09:58PM
You are talking about combining characters, yes? You are taking two characters from UTF-32 and combining them: http://www.fileformat.info/info/charset/UTF-32/list.htm [fileformat.info]
Just because two characters occupy the same space on the screen that doesn't make them one character.
This argument is getting silly. A char is a datatype in low-level languages and a string is not. You are arguing with my explanation using historical words outside of their intended context. Historically char was short for character. It is also a good way for a layman to understand what char is. I was only trying to clarify someone else's not so clear remark. You are not helping me with that endeavor and it's pedantry bordering on trolling or something. If the char datatype isn't designed to hold a character then what is it used for?
(Score: 2) by Pino P on Thursday January 14 2016, @04:06PM
If the char datatype isn't designed to hold a character then what is it used for?
Let me try to sum up your argument and mine in a manner that addresses the point at hand: The data types called char were originally designed to hold a character, back when the users of computing were members of cultures whose languages used few characters. As the number of cultures served by computing has grown, the data types called char have become insufficient for that purpose.
(Score: 2) by VLM on Wednesday January 13 2016, @02:31PM
Yet string comparison is well known to take O(n) time.
It's O(constant); it's just traditional to call it 1. On any set of finite strings the comparison will never take more than a constant, with that constant based on the max string length, or maybe the finite sized data type or finite sized machine. The number of strings will have no impact on run time.
The biggest screw up with scalability is not understanding the problem. Best case insertion sort is O(n) given one new entry to add to a pre-sorted set, and quicksort is way worse at O(n log n), and I got into a huge workplace argument years ago with a guy who apparently thought input has no effect on scalability. See, I agreed with him that if you feed a QS a random pile of data it's n log n and insertion is n squared, so QS is way faster for random data, but we're not sorting random data, we're sorting already sorted data... I may be forgetting some details.
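That workplace argument can be settled by counting comparisons on already-sorted input; a minimal sketch, using a first-element pivot for the quicksort since that is the degenerate case (helper names are mine):

```python
def insertion_sort_comparisons(data):
    """Comparisons used by insertion sort; O(n) on already-sorted input."""
    a, comps = list(data), 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comps += 1
            if a[j] <= key:      # already in place: stop after one compare
                break
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return comps

def quicksort_comparisons(data):
    """Comparisons used by quicksort with a first-element pivot;
    degenerates to O(n^2) on already-sorted input."""
    a, comps = list(data), 0
    def qs(lo, hi):
        nonlocal comps
        if lo >= hi:
            return
        pivot, store = a[lo], lo
        for k in range(lo + 1, hi + 1):   # Lomuto-style partition
            comps += 1
            if a[k] < pivot:
                store += 1
                a[store], a[k] = a[k], a[store]
        a[lo], a[store] = a[store], a[lo]
        qs(lo, store - 1)
        qs(store + 1, hi)
    qs(0, len(a) - 1)
    return comps

already_sorted = list(range(100))
print(insertion_sort_comparisons(already_sorted))  # 99: n - 1
print(quicksort_comparisons(already_sorted))       # 4950: n(n-1)/2
```

A production quicksort avoids this by picking a random or median-of-three pivot, but the point stands: the input distribution matters.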
(Score: 2) by VLM on Wednesday January 13 2016, @02:34PM
The number of strings will have no impact on run time.
The runtime of any individual comparison, to be specific. Evaluating X>Y takes no longer because there are a million or a trillion other comparisons to check later. Obviously in practical use, today's value of "n" will have an impact on a sort's total wall clock time.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @02:59PM
Wait, the time to compare two strings in the collection depends on the number of strings in the collection?
If you have a collection of n_strings strings whose average length is n_chars characters per string, then the sort performs O(n_strings log n_strings) comparisons, but each comparison costs O(n_chars). Note the different variables in the big-O notation. You can change the average length of the strings in your collection independently of the size of the collection. You can have a collection of 20 strings, each a million characters long, or a collection of a million strings, each 20 characters long. While the total number of characters is the same, the sorting time for the two will differ dramatically.
(Score: 2) by bzipitidoo on Wednesday January 13 2016, @03:30PM
Yes, you spotted the main trick to that trick question, different n's :).
And yet, the two quantities are not completely unrelated. As the number of strings grows, the match length also grows. Suppose you have an alphabet of 26 letters, and 27 strings to sort. At least 2 of the strings must start with the same letter. With 26^2+1 strings to sort, that grows to 2 matching letters at the start for at least 2 of the strings.
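That pigeonhole argument generalizes: among any k distinct strings over an alphabet of size A (each at least L letters long), at least two must agree on their first L letters whenever k > A^L, since there are only A^L possible length-L prefixes. A small sketch (the helper name is mine):

```python
def forced_prefix_len(k: int, alphabet: int = 26) -> int:
    """Longest prefix length L such that any k distinct strings over the
    alphabet (each of length >= L) must include two strings agreeing on
    their first L letters: pigeonhole over the alphabet**L prefixes."""
    L = 0
    while k > alphabet ** (L + 1):
        L += 1
    return L

print(forced_prefix_len(26))         # 0: 26 strings can all start differently
print(forced_prefix_len(27))         # 1: two of 27 share a first letter
print(forced_prefix_len(26**2 + 1))  # 2: two of 677 share their first two letters
```

So the guaranteed match length grows like log_A(k), which is why the per-comparison cost can't be treated as a constant as the collection grows.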
(Score: 2) by vux984 on Wednesday January 13 2016, @06:24PM
You are abusing order notation and stating the problem in a confusing manner, although there is some technical merit to your argument.
Yet string comparison is well known to take O(n) time.
Order notation is to measure asymptotic behavior as a problem size grows.
String comparison is O(n) asymptotically; that is, as the number of elements in the strings grows arbitrarily large, the time to compare two strings grows linearly. That's what O(n) means. If the strings are constrained to a finite length then they can be considered O(1).
Likewise Qsort is asymptotically O(n log n); that is, as the number of elements to be sorted grows arbitrarily large, the time it takes to qsort them grows at the rate of n log n.
So for all practical purposes Qsort takes O(n log n) even on strings.
However, yes, if you were actually interested in representing the time to quicksort strings where BOTH the size of the strings AND the number of strings to be compared were allowed to grow unbounded, then yeah, it's O(m * n log n), where n is the number of elements and m is the size of the elements.
(Score: 5, Funny) by Anonymous Coward on Wednesday January 13 2016, @01:20PM
Programming is a science, but trolling is a art.
(Score: 1) by Shimitar on Wednesday January 13 2016, @01:39PM
With your permission, i am going to use this quote, it's just perfect.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @02:46PM
Go ahead, I didn't come up with 'a art [urbandictionary.com]' anyway.
(Score: 2) by GungnirSniper on Wednesday January 13 2016, @04:21PM
I second that. It should be added to the list of quotes used on the lower right of every page.
Tips for better submissions to help our site grow. [soylentnews.org]
(Score: 2) by broggyr on Wednesday January 13 2016, @05:59PM
You could do it without their permission, since they're AC
Taking things out of context since 1972.
(Score: 2) by isostatic on Wednesday January 13 2016, @11:28PM
http://www.theonion.com/article/area-man-constantly-mentioning-he-doesnt-own-a-tel-429 [theonion.com]
16 years, wow.
(Score: 2) by FatPhil on Wednesday January 13 2016, @12:13PM
I bet that isn't the case. And I'm prepared to wager real money on it.
Whilst TV might be decreasing by 1 or 2 percent per year, it's still the massive majority of viewing. Digital video is only growing at a massive rate because it's relatively so small, way less than a tenth of all viewing. Sure, it will continue to grow, but it's got a very long way to go, and 2020 is only 4 years away. So I reckon it's a safe bet.
It's not like the CEO of the company has a vested interest in exaggerating and pumping his own flagship product, or anything.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 2) by VLM on Wednesday January 13 2016, @01:29PM
The problem with the theory is it's all calculus infinitesimals and continuous functions, whereas in reality there's a handful of major pro sports contracts that could drive everything online if NFL Football dumps FOX or whatever and goes youtube or whatever. Also I think some pro sports contracts extend past 2020 already, so you'd need a broken contract, either mutually or via a massive legal issue.
Another good one would be there's only a couple "major networks" and if one or two go bankrupt in the next recession due to imploding ad revenue or whatever cause, the studios are still going to want to pump out content and they're just gonna sell to amazon or netflix or whatever, rather than close.
It's far more likely to be some insane step function where networks or OTA sports suddenly dies in some new york city boardroom, and the next day everyone uses netflix or whatever.
(Score: 2) by Runaway1956 on Wednesday January 13 2016, @02:51PM
Problem is, I agree with him, so I'm not taking the bet.
I can't remember the last time I turned a television on. I've watched the news, videos, movies, and everything else on the computer for a long, long time now. For years, I would come into the house, and turn the television OFF, because it annoyed me. More recently, the wife has learned that she can watch all her stuff on the computer, so she doesn't even turn the television on now.
But, that boob tube still thrives in many other homes. The major networks still have a huge following. I don't see television going out of style in the next 4 years.
Abortion is the number one killer of children in the United States.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @03:39PM
There are two kinds of people in this world: Those that enter a room and turn the television set on and those that enter a room and turn the television set off.
- The Manchurian Candidate
Then there are those who haven't seen "The Manchurian Candidate." [wikipedia.org]
(Score: 1, Funny) by Anonymous Coward on Wednesday January 13 2016, @04:57PM
Probably because they turned the television set off. :-)
(Score: 4, Insightful) by Anonymous Coward on Wednesday January 13 2016, @01:34PM
Bandwidth caps and political corruption will see to that.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @05:45PM
But sir, you can have all the bandwidth to Youtube you like, here's a nice little contract to sign up for Google Fibre.
And with everyone in our search bubble/echo chamber, we can now truly control what you watch, think and say! Oh, you thought that you really get to decide what you watch when browsing youtube? Heck no, you are presented with stuff we have decided we would like you to watch!
Keep consuming, pleb!
(Score: 3, Insightful) by Celestial on Wednesday January 13 2016, @05:54PM
Bingo. I live in a condominium, and the condominium association has an agreement with Comcast cable. Thus, they are the only Internet provider option I have. I can't even get DSL as I live too far from the nearest telco office. Anyway, Comcast is starting to enforce their monthly data caps of 300 GB, which you can exceed for $10 per additional 10 GB. How generous of them. With the astronomical size of 4K with HDR movie and television content, the only real option I have to watch it is by renting or purchasing and viewing Ultra HD Blu-Ray discs.
(Score: 2) by Bobs on Wednesday January 13 2016, @01:50PM
Taking an existing trend and projecting it out for years and assuming it will not change can lead to all sorts of wrong conclusions. Especially using statistics.
“In the space of one hundred and seventy six years the Lower Mississippi has shortened itself two hundred and forty-two miles. That is an average of a trifle over a mile and a third per year. Therefore, any calm person, who is not blind or idiotic, can see that in the Old Oölitic Silurian Period, just a million years ago next November, the Lower Mississippi was upwards of one million three hundred thousand miles long, and stuck out over the Gulf of Mexico like a fishing-pole. And by the same token any person can see that seven hundred and forty-two years from now the Lower Mississippi will be only a mile and three-quarters long, and Cairo [Illinois] and New Orleans will have joined their streets together and be plodding comfortably along under a single mayor and a mutual board of aldermen. There is something fascinating about science. One gets such wholesale returns of conjecture out of such a trifling investment of fact.” - Mark Twain, Life on the Mississippi (1884)
(Score: 2) by Ken_g6 on Wednesday January 13 2016, @03:00PM
Good point, except you forgot the obligatory XKCD. [xkcd.com]
Not spending enough time online watching videos and stuff?
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @03:08PM
My favourite example was once given in a talk (unfortunately I don't remember by whom). It was a nice exponentially growing curve showing the number of horse carriages in a certain time frame. Clearly a sign that the horse carriage industry would look into a bright future at that point. Then the speaker added the following years to the curve. It was going rapidly down quite soon. What happened? Well, the car happened.
(Score: 5, Insightful) by VLM on Wednesday January 13 2016, @01:57PM
The average American watches more than five hours of TV per day
Those numbers are meaningless because the curve is less bell shaped than even income and wealth. The actual curve is log or exponential (I can't remember) and the half point is somewhat over one hour, and then that curve is merged with vegetables (diagnosed or otherwise) who lay in hospital beds or on couches and "watch" 16-24 hours per day every day, because they're basically dying. Think like nursing home dayrooms where hundreds of people that can't stand up anymore stare at a TV for 16 hours per day. Generally pretty sad stuff. You can have a lot of fun by asking for the two separate curves of under 6 hours and over 6 hours. Also for fun ask for the demographics, over 6 hours there's almost nobody over $15K/yr and mostly 60+ yrs old, basically FOX news viewers LOL.
One aspect carefully not discussed is the social / cultural effect of atomization. So in the 70s everybody watched MASH or whatever, and it was a shared cultural experience. You go to work and everyone LOLs at Radar O'Reilly or whatever. However the decline has been sharp enough that only say 9 million people watch my wife's favorite show, "The Amazing Race". For folks who've never seen it, it's the usual reality/gameshow format but for world travel, and she's a total travel tart, so after she memorized every Rick Steves video she has to watch "something" so that's it. Now the atomization effect is only 9 mil out of 320 mil people watch the race. So about 98% of the population, or 49 in 50, won't watch this free show. Personally I don't like the show. Anyway the relevant point is your close human tribe is small enough that she almost certainly doesn't hang out with any fans. TV is "bowling alone" now, whereas in the old days TV was a shared cultural experience; supposedly 1/3 the living human population of the country watched the last episode of MASH together... Those days are gone and are not coming back.
Anyway atomization is important because when I'm too tired to chill by playing modded minecraft, I'll watch my favorite youtube let's-play guy, direwolf20. I've watched all his series and gotten all kinds of interesting ideas I've applied to my own play. Also he's just kinda fun, and when he's in a group or team of players it's all LOL. So I'm a fan, and he's officially VLM recommended. Anyway about half a million people watch him along with me, some fans probably here. Now that sounds horrific compared to the stats for The Amazing Race at twenty times higher, but it turns out not to matter at all. See there are so few fans of the race that my wife is a lonely viewer all by herself, and I am too. When there is no shared culture anymore, there is no stigma against not participating in the non-existent shared culture. Something else to think about is DW20 is one dude with a low production cost, much less than 1/20th what goes into The Amazing Race... economically they're doomed in the long run. The revenue is probably a bit more than 20x for the race, but the cost of revenue must be 100 to 10000 times as high. DW20's profit margin must be insane.
So atomization means there's metastability at 20% of the population viewing and up, and also basically at 0% viewing any individual video.
That metastability and long term trends means the dinosaurs who still operate thinking 1/3 of the population will watch MASH 2.0 together are going to get absolutely crushed when basically all of the population abandons them.
It's already happened with kids. In my generation my sister and I watched the crap Disney and Nickelodeon and MTV shoveled, more or less. My kids watch youtube based on recommendations from their facebook friends. My kids don't watch cable, not even MTV. Not because I force them off the TV (although I probably should, given the cultural garbage those networks shovel now) but because the school district issues ipads and TV networks are dead technology, like 8-tracks.
(Score: 2) by Snotnose on Wednesday January 13 2016, @07:31PM
The average American watches more than five hours of TV per day
How do they know? Like most everyone else my cable box is plugged into my TV. Like most everyone else so are my PS3, Roku, and DVD player. Even when the TV is set to one of these other inputs as far as the cable box knows I'm watching TV.
But, you say, when you watch TV you flip channels, pause, fast forward, rewind, etc. Being the snot that I am, occasionally while playing Borderlands I'll reach over to my cable remote and diddle it a bit. Why do I do this? I have no idea, but I get a sense of satisfaction when I do.
Relationship status: Available for curbside pickup.
(Score: 2) by isostatic on Thursday January 14 2016, @12:11AM
One aspect carefully not discussed is the social / cultural effect of atomization.
This is something that reality TV gets stuck into. People watch it as a "wallpaper" type event, but families seem to cluster around to watch Strictly Come Dancing or X-Factor or a baking show or whatever, root for "their" favourites week after week, and tune in to watch it live.
However as Mark Gatiss rightly states:
http://www.radiotimes.com/news/2015-11-09/mark-gatiss-overnight-figures-are-insane--people-will-be-watching-doctor-who-long-after-bake-off-is-over [radiotimes.com]
But Gatiss warns against comparing the "temporary popularity" of shows like Bake Off and The X Factor with a show with a "proper legacy" such as Doctor Who.
"Those episodes of Bake Off or The X Factor, and their virtues are manifest, will never be watched again. Yet Doctor Who will be watched in 50 years' time, 100 years' time. It's a marathon, not a sprint. I love things to be popular, I want things to be watched, but this sort of scrutiny is deadly."
"Better TV", dramas, comedies, etc., people watch when they want to. Dr Who comes out in the UK on a Saturday at some point. Rarely the same time, nobody really cares, and none of my peers know what time it's on; it just comes on at "some point" during Saturday evening, and they've watched it by Monday morning for watercooler chatting. At least when it was still good.
A mid-season episode had the following ratings
"The Zygon Invasion 3.87m (overnight) 5.76m (final) 6.49m (L+7) AI 82"
That's 3.87m watching it Saturday night, 1.89m recording it off air and watching it within 7 days, and 730,000 watching it on iplayer.
Sherlock "The Abominable Bride" was the winner of the Christmas ratings, pulling in a total of 11.6m viewers. Almost everyone in my peer group watched it (despite 4 in 5 people in the UK not watching it). 8.4m watched it the day it was broadcast, 3.2m (and most of my peer group fit into this) watched it later.
The trends are pushing against schedulers, and it's about time. If producers want that shared culture to drive viewers, they need to not only release globally at the same time, but also build a show that people want to talk about.
(Score: 4, Interesting) by Webweasel on Wednesday January 13 2016, @02:42PM
This is pretty much the case in my household.
My kids game and watch youtube and skype at the same time!
So much so I had to have another phone line fitted, so I could use netflix while they are active.
They don't recognise famous people from television, but they for sure know who PewDiePie is.
South Park totally nailed this with their death-of-the-living-room/CartmanBrah/PewDiePie episodes at the end of last year's season. If you have not watched it you should; they have their finger on the pulse of society better than anything else on broadcast media right now.
OFC, I watched from a streaming site, I gave up any tunable TV devices and my TV licence about 3 years ago now.
Now did I make the move because I was streaming more? No... I jumped because of advertising. No, not jumped. Pushed. Advertising pushed me away from TV.
Priyom.org Number stations, Russian Military radio. "You are a bad, bad man. Do you have any other virtues?"-Runaway1956
(Score: 4, Interesting) by GungnirSniper on Wednesday January 13 2016, @04:52PM
At least one television executive is planning on cutting commercials. [adweek.com]
(Score: 1) by eliphas_levy on Thursday January 14 2016, @12:42AM
Well, they had better, anyway. TNT is about 1/3rd content, 2/3rds ads. This is why I watch them (if ever) after recording the movie I was interested in.
Not that I did it even *once* in the last year, though... Since the movies are always old, by downloading them or searching my recorded folder I can almost always find the very thing that is airing live there.
This is a sigh.
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @03:11PM
I'd bet quite a bit of "watching" online videos is actually listening to music, while doing something else (look at the number of music videos on YouTube!). That would not be in competition with most TV uses, but more with radio stations (and dedicated online streaming services).
(Score: 0) by Anonymous Coward on Wednesday January 13 2016, @05:51PM
Remember MTV? From back in the day when the M stood for music?
(Score: 0) by Anonymous Coward on Thursday January 14 2016, @10:05AM
I explicitly wrote: "most TV uses." And the fact that you had to go to the past to find a counter example only strengthens my point: Which TV station specializes in playing music today?
(Score: 2) by Dunbal on Wednesday January 13 2016, @03:16PM
Yeah, no, I don't think YouTube is solely responsible for the "cord cutting" phenomenon, no matter how much Google would like you to think so. The blame belongs elsewhere [roku.com].
(Score: 1, Insightful) by Anonymous Coward on Wednesday January 13 2016, @03:50PM
I think Roku is a player in that. But DVR was/is a big factor too.
People do not watch TV like they used to. We were trying to do this back when VHS was a thing, but storage just was not up to the task (speed and cost), and fast-forward was bad too. It is very good now. We want to timeshift what we watch to a convenient time. YouTube/Roku/TiVo and services like them let you do this.
I know several people who have not cut the cord, but only because they want to watch 'the game'. They do not watch it live. They use the DVR and timeshift out the commercials: they will come in when the game starts, pause it, and then come back 1-2 hours later and start watching. Or just straight up record it and skip all the boring junk.
I watched the progression of my coworkers. It was 'did you see x last night' to 'got a tivo' to 'have not watched it yet I have it recorded' to 'I got rid of cable' to 'I mostly use blah' where blah is something like youtube/roku/hulu/netflix/amazon/torrent.
There is no 'one big thing'. It is a bunch of techs all coming together to create the a-la-carte viewing everyone has wanted, mostly storage and network speeds. Roku is solving an interesting fragmentation problem: the copyright holders are trying to squeeze a bit more out of the lemon by shopping their catalogs around through the different services, and all-in-one devices like Roku glue it all back together. I personally use Kodi. The copyright holders are going to find individual shows have less and less value, because they will be competing with the entire back catalog of TV plus anything new. It is the same problem the music copyright conglomerates are having. Yet they are trying to charge more and more, so services like YouTube are filling in the other end.
I figure cable will move to a time-window sort of mass scheduling, with people scheduling shows to be recorded onto their DVRs. Or just straight-up broadcast IP TV with massive amounts of bandwidth and a-la-carte pricing. They will have to. The 'never-corders' are coming. They probably have a good 15-20 years to get with it, but after that it will be massive red on the balance sheets if they don't.
(Score: 0) by Anonymous Coward on Thursday January 14 2016, @10:12AM
Actually I'd say a lot of the blame goes here. [wikipedia.org]
If there were no (mostly) ad-free TV stations in Germany, I'd probably rarely watch TV, too (I already avoid private TV stations, and public TV stations at the times when they are allowed to display ads; something has to be very interesting to make an exception for that — and then, I'll probably record and watch later, allowing me to skip ads).
(Score: 3, Insightful) by Zinho on Wednesday January 13 2016, @05:31PM
Unfortunately for the TV networks, what they're offering isn't what we want. I'm not sure it's ever been what we really wanted, just what was available.
At the turn of the millennium (1999-2000) Qwest Communications ran an ad that showed they understood what we want: ". . . every movie ever made in any language anytime, day or night." * [youtube.com] So far even the Internet has failed to provide that for us, despite BitTorrent's best efforts. YouTube is getting close, despite the studios' best efforts to neuter it.
It seems to me that we have the technology to provide what we want, but not the business model - likely because the price point the public is willing to pay is far below what generated all the revenue for Hollywood and cable companies for so many years.
* really, I'd like to see TV episodes, live performances (concerts, theater, etc) and news reports added to that list as well. There's no reason why anything that's been recorded since the advent of digital video can't be archived and made available on the 'Net besides the cost/benefit ratio for the copyright owner. I'd love to see the Library of Congress taking on a role as central repository for that sort of thing and stream their archives to the public.
"Space Exploration is not endless circles in low earth orbit." -Buzz Aldrin
(Score: 3, Interesting) by isostatic on Wednesday January 13 2016, @11:26PM
It seems to me that we have the technology to provide what we want, but not the business model - likely because the price point the public is willing to pay is far below what generated all the revenue for Hollywood and cable companies for so many years.
http://techcrunch.com/2015/01/20/2015-ad-spend-rises-to-187b-digital-inches-closer-to-one-third-of-it/ [techcrunch.com]
Puts TV advertising spend at $79bn a year: $250 per person, or about $670 per household, i.e. $55 per month per household. That means the average US household effectively pays $55 a month to watch TV, in addition to its various cable subscriptions. (By comparison, the TV licence in the UK, which funds the BBC, is $210 per household per year, or $17.50 per month.)
If the average person watches 1 hour a day, or about 360 hours a year, and 1/3rd of that is adverts, that puts the worth of a viewer's time at something like $2 per hour of adverts watched.
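The figures above can be sanity-checked with some quick back-of-the-envelope arithmetic. The population and household counts below are my own assumptions (roughly the mid-2010s US Census figures), not numbers from the comment itself:

```python
# Back-of-the-envelope check of the TV ad-spend figures above.
ad_spend = 79e9          # annual US TV advertising spend, USD (TechCrunch figure)
population = 316e6       # assumed US population, ~2014
households = 118e6       # assumed US households, ~2014

per_person = ad_spend / population           # ~$250 per person per year
per_household = ad_spend / households        # ~$670 per household per year
per_household_month = per_household / 12     # ~$56 per household per month

hours_per_year = 360     # 1 hour/day, rounded as in the comment
ad_fraction = 1 / 3      # assumed share of airtime that is adverts
ad_hours = hours_per_year * ad_fraction      # ~120 hours of adverts per year
value_per_ad_hour = per_person / ad_hours    # ~$2 per hour of adverts

print(f"per person/year:     ${per_person:.0f}")
print(f"per household/month: ${per_household_month:.0f}")
print(f"value per ad hour:   ${value_per_ad_hour:.2f}")
```

The numbers line up with the comment's estimates: roughly $250 per person, $56 per household per month, and about $2 of advertising revenue per hour of adverts a viewer sits through.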
(Score: 2) by isostatic on Wednesday January 13 2016, @11:53PM
Television? That particular form of entertainment did not last much beyond the year 2040
https://www.youtube.com/watch?v=5Hkj2tOJ6bk [youtube.com]
I suspect this is a rare case where Trek may be accurate in its prediction, and it's not something I'd have guessed even in the late 90s (while I was watching new Voyager and SG-1 episodes by downloading them over a 56k modem).
Of course there's a question of what is TV? Sure, watch American Idol live on ABC on a Saturday night and that's TV. What if you time-shift it by a few days? What if you watch a 45-minute TV drama live? What if that drama is time-shifted? What if it's the same format but made for Netflix? Is "Man in the High Castle" TV? How about "Gotham"? Is Jeremy Clarkson's new show "TV"?
On youtube, 6 million people have watched a 20 minute program which analyzes a trailer for a film [youtube.com]. On multiple occasions [youtube.com].
20 minutes is a similar length to a half-hour TV program minus the adverts, and 6 million is a similar number to the audience that tunes into the Simpsons each week (although in the YouTube case that is a global audience, so probably only 1-2 million in the US).
Why are the Simpsons (20 minutes, 6 million viewers) a TV episode, but these youtube videos (20 minutes, 6 million viewers) not?
It seems the popular youtube videos tend to be either based on other works (films, games, etc), or real life. There seems to be little original drama that pulls millions of viewers on a regular basis.
(Score: 2) by cafebabe on Thursday January 14 2016, @02:33AM
YouTube is becoming Network 23 [wikipedia.org].
1702845791×2