The average American watches more than five hours of TV per day, but pretty soon that leisure time may be dominated by YouTube and other online video services.
In an address at CES 2016, YouTube's chief business officer Robert Kyncl argued that digital video will be the single biggest way that Americans spend their free time by 2020 – more than watching TV, listening to music, playing video games, or reading.
The amount of time people spend watching TV each day has been pretty steady for a few years now, Mr. Kyncl pointed out, while time spent watching online videos has grown by more than 50 percent each year. Data from media research firm Nielsen shows that it's not just young people watching online videos, either: adults aged 35 to 49 spent 80 percent more time on video sites in 2014 than in 2013, and adults aged 50 to 64 spent 60 percent more time on video sites over the same time period.
Why the shift?
(Score: 2) by curunir_wolf on Wednesday January 13 2016, @02:18PM
I am a crackpot
(Score: 1) by Shimitar on Wednesday January 13 2016, @02:21PM
Ssshh... it's a JavaScript "scientist" :)
Coding is an art. No, Java is not coding. Yes, I am biased, I know; sorry if this bothers you.
(Score: 2) by tibman on Wednesday January 13 2016, @03:07PM
For those who want to know why: a string is an array of char (characters). Char is a datatype because it has a fixed size in memory, just like int, float, and the other primitives.
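As a quick illustration (a sketch using Python's standard struct module, which exposes the classic C-style fixed layouts): char occupies exactly one byte, int and float have their own fixed widths, and a string is just some number of chars in a row.

```python
import struct

# Standard ("=") sizes mirror the classic C layout: fixed width is
# exactly what makes these primitives.
print(struct.calcsize("=c"))    # 1 byte:  a char
print(struct.calcsize("=i"))    # 4 bytes: an int
print(struct.calcsize("=f"))    # 4 bytes: a float

# A string, by contrast, is a sequence of chars of arbitrary length:
print(struct.calcsize("=10s"))  # 10 bytes: ten chars in a row
```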
SN won't survive on lurkers alone. Write comments.
(Score: 2) by Pino P on Wednesday January 13 2016, @06:12PM
char (a character)
The data type char does not fully represent a character. In C it is typically used to hold a UTF-8 code unit; in Java it holds a UTF-16 code unit. A full character won't fit in either.
How many characters is é (Latin small letter E with acute)? What about y̾ (Latin small letter Y with combining vertical tilde)? Or 加 (CJK ideogram meaning "add")? Or 💩 (pile of poo)? Or 💩̾ (pile of poo with combining steam)?
Each of these five is one "grapheme cluster", though all five are more than one UTF-8 code unit, three are more than one UTF-16 code unit, and two are more than one code point (so UTF-32 won't help). See the UTF-8 Everywhere manifesto [utf8everywhere.org] and why Swift's string API is so messed up [mikeash.com] to learn how "character" isn't a data type either.
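Those counts are easy to check. A quick Python sketch, computing code-unit counts for the five examples directly from the standard encodings:

```python
samples = [
    ("e-acute (precomposed)",        "\u00e9"),
    ("y + combining vertical tilde", "y\u033e"),
    ("CJK 'add'",                    "\u52a0"),
    ("pile of poo",                  "\U0001f4a9"),
    ("poo + combining",              "\U0001f4a9\u033e"),
]
for name, s in samples:
    utf8_units = len(s.encode("utf-8"))            # 8-bit code units
    utf16_units = len(s.encode("utf-16-le")) // 2  # 16-bit code units
    code_points = len(s)                           # what UTF-32 would store
    print(f"{name}: UTF-8={utf8_units} UTF-16={utf16_units} "
          f"code points={code_points}")
```

All five need more than one UTF-8 code unit; three (y̾, 💩, 💩̾) need more than one UTF-16 code unit; two (y̾, 💩̾) span more than one code point.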
(Score: 2) by tibman on Wednesday January 13 2016, @07:06PM
A char is typically one byte. Historically, char is short for character and represented an ASCII character. Only higher-level code cares about what a collection of chars represents. UTF-8 is a multi-byte encoding, so by definition you cannot pre-allocate space for a UTF-8 character unless you already know what it is. You also cannot know how many bytes a UTF-8 character occupies until you examine it: one UTF-8 character may be one byte or it may be four. So there will never be a primitive datatype for UTF-8. When you are talking about UTF-8, you might as well be talking about strings or some other dynamic structure. I think only UTF-32 could be made into a primitive, and a UTF-32 char (4 bytes, fixed) would indeed fully represent any character.
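One small caveat on "cannot know how many bytes unless you decode it": in well-formed UTF-8 the leading byte of each sequence announces its length, so you only need to inspect one byte, not decode the whole thing. A sketch (the helper name is my own):

```python
def utf8_seq_len(lead: int) -> int:
    """Length in bytes of a UTF-8 sequence, read off its leading byte."""
    if lead < 0x80:
        return 1                  # 0xxxxxxx: plain ASCII
    if lead >> 5 == 0b110:
        return 2                  # 110xxxxx: two-byte sequence
    if lead >> 4 == 0b1110:
        return 3                  # 1110xxxx: three-byte sequence
    if lead >> 3 == 0b11110:
        return 4                  # 11110xxx: four-byte sequence
    raise ValueError("continuation or invalid leading byte")

for ch in ("A", "é", "加", "💩"):
    data = ch.encode("utf-8")
    print(ch, utf8_seq_len(data[0]), len(data))
```

The larger point stands, though: combining marks mean a user-perceived character can still span several such sequences, so a fixed-size slot still won't hold it.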
SN won't survive on lurkers alone. Write comments.
(Score: 2) by Pino P on Wednesday January 13 2016, @07:33PM
Each of these five is one "grapheme cluster", though ... two are more than one code point (so UTF-32 won't help).
a UTF-32 char (4 bytes, fixed) would indeed fully represent any character
A UTF-32 code unit does indeed represent any code point. But because not all characters of a script are available precomposed [wikipedia.org], a single grapheme cluster may span more than one code point if it has combining diacritics attached to it. Nor is it very useful to divide a string between the code points that make up a grapheme cluster. That's what I meant to get across by including the examples of y̾ (y with vertical tilde) and 💩̾ (poo with steam): there is no fixed-width data type that can represent all characters.
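A concrete way to see the "not all characters are available precomposed" point, using Python's standard unicodedata module: NFC normalization fuses e + combining acute into the single precomposed code point U+00E9, but no precomposed form of y + combining vertical tilde exists, so it stays at two code points no matter what you do.

```python
import unicodedata

def nfc_len(s: str) -> int:
    """Code-point count after NFC (composing) normalization."""
    return len(unicodedata.normalize("NFC", s))

print(nfc_len("e\u0301"))  # 1: precomposed U+00E9 exists
print(nfc_len("y\u033e"))  # 2: no precomposed form; still two code points
```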
(Score: 3, Insightful) by tibman on Wednesday January 13 2016, @09:58PM
You are talking about combining characters, yes? You are taking two characters from UTF-32 and combining them: http://www.fileformat.info/info/charset/UTF-32/list.htm [fileformat.info]
Just because two characters occupy the same space on the screen that doesn't make them one character.
This argument is getting silly. A char is a datatype in low-level languages and a string is not. You are arguing with my explanation using historical words outside of their intended context. Historically char was short for character. It is also a good way for a layman to understand what char is. I was only trying to clarify someone else's not so clear remark. You are not helping me with that endeavor and it's pedantry bordering on trolling or something. If the char datatype isn't designed to hold a character then what is it used for?
SN won't survive on lurkers alone. Write comments.
(Score: 2) by Pino P on Thursday January 14 2016, @04:06PM
If the char datatype isn't designed to hold a character then what is it used for?
Let me try to sum up your argument and mine in a manner that addresses the point at hand: the data types called char were originally designed to hold a character, back when the users of computing belonged to cultures whose languages used few characters. As the number of cultures served by computing has grown, the data types called char have become insufficient for that purpose.