
posted by cmn32480 on Wednesday January 13 2016, @11:23AM
from the cord-cutters-ftw dept.

The average American watches more than five hours of TV per day, but pretty soon that leisure time may be dominated by YouTube and other online video services.

In an address at CES 2016, YouTube's chief business officer Robert Kyncl argued that digital video will be the single biggest way that Americans spend their free time by 2020 – more than watching TV, listening to music, playing video games, or reading.

The amount of time people spend watching TV each day has been pretty steady for a few years now, Mr. Kyncl pointed out, while time spent watching online videos has grown by more than 50 percent each year. Data from media research firm Nielsen shows that it's not just young people watching online videos, either: adults aged 35 to 49 spent 80 percent more time on video sites in 2014 than in 2013, and adults aged 50 to 64 spent 60 percent more time on video sites over the same time period.

Why the shift?


Original Submission

 
  • (Score: 2) by tibman (134) Subscriber Badge on Wednesday January 13 2016, @07:06PM (#289206)

    A char is typically one byte. Historically, char is short for character and represents an ASCII character. Only higher-level stuff cares about what a collection of chars represents. UTF-8 is a multi-byte encoding, so by definition you cannot pre-allocate space for a UTF-8 character unless you already know what it is. You also cannot know how many bytes a UTF-8 character is unless you decode it. One UTF-8 character may be one byte or it may be four. So there will never be a primitive datatype for UTF-8. When you are talking about UTF-8, you might as well be talking about strings or some other dynamic structure. I think only UTF-32 could be made into a primitive, and a UTF-32 char (4 bytes, fixed) would indeed fully represent any character.
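
    A minimal C sketch of that point (assuming the source file itself is saved as UTF-8; utf8_len is just an illustrative helper, not anything standard): sizeof(char) is one byte, but a single UTF-8 character may need up to four of them, and you only find out how many by inspecting its leading byte.

        #include <stdio.h>
        #include <string.h>

        /* Length in bytes of a UTF-8 sequence, read off its leading byte. */
        static int utf8_len(unsigned char lead)
        {
            if (lead < 0x80)           return 1; /* 0xxxxxxx: plain ASCII     */
            if ((lead & 0xE0) == 0xC0) return 2; /* 110xxxxx: 2-byte sequence */
            if ((lead & 0xF0) == 0xE0) return 3; /* 1110xxxx: 3-byte sequence */
            if ((lead & 0xF8) == 0xF0) return 4; /* 11110xxx: 4-byte sequence */
            return -1;                           /* continuation or bad byte  */
        }

        int main(void)
        {
            const char *samples[] = { "A", "é", "€", "💩" };
            for (int i = 0; i < 4; i++) {
                printf("%s: %zu byte(s) in UTF-8, lead byte says %d\n",
                       samples[i], strlen(samples[i]),
                       utf8_len((unsigned char)samples[i][0]));
            }
            return 0;
        }

    With a UTF-8 source file this reports 1, 2, 3 and 4 bytes for the four samples, even though each reads as "one character" on screen.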

    --
    SN won't survive on lurkers alone. Write comments.
  • (Score: 2) by Pino P (4721) on Wednesday January 13 2016, @07:33PM (#289229) Journal

    Each of these five is one "grapheme cluster", though ... two are more than one code point (so UTF-32 won't help).

    a UTF-32 char (4 bytes, fixed) would indeed fully represent any character

    A UTF-32 code unit does indeed represent any code point. But because not all characters of a script are available precomposed [wikipedia.org], a single grapheme cluster may span more than one code point if it has combining diacritics attached to it. Nor is it very useful to divide a string between the code points that make up a grapheme cluster. That's what I meant to get across by including the examples of y̾ (y with vertical tilde) and 💩̾ (poo with steam): there is no fixed-width data type that can represent all characters.
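
    A small C sketch of that distinction (again assuming a UTF-8 source file; count_code_points is only an illustrative helper): "y" followed by a combining vertical tilde renders as one grapheme cluster, yet decodes to two code points, so even a fixed-width UTF-32 array needs two entries to hold it.

        #include <stdio.h>

        /* Count Unicode code points in a UTF-8 string by skipping
         * continuation bytes (those of the form 10xxxxxx). */
        static int count_code_points(const char *s)
        {
            int n = 0;
            for (; *s; s++)
                if (((unsigned char)*s & 0xC0) != 0x80)
                    n++;
            return n;
        }

        int main(void)
        {
            /* U+0079 'y' plus U+033E COMBINING VERTICAL TILDE: one grapheme
             * cluster on screen, two code points in memory. */
            const char *y_tilde = "y\xCC\xBE";
            printf("code points: %d\n", count_code_points(y_tilde)); /* prints 2 */
            return 0;
        }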

    • (Score: 3, Insightful) by tibman (134) Subscriber Badge on Wednesday January 13 2016, @09:58PM (#289287)

      You are talking about combining characters, yes? You are taking two characters from UTF-32 and combining them: http://www.fileformat.info/info/charset/UTF-32/list.htm [fileformat.info]
      Just because two characters occupy the same space on the screen, that doesn't make them one character.
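
      Put in fixed-width UTF-32 terms, a rough sketch of that pair: two code units that a renderer stacks into a single glyph.

          #include <stdio.h>
          #include <stdint.h>

          int main(void)
          {
              /* U+0079 'y' and U+033E COMBINING VERTICAL TILDE as UTF-32
               * code units: two entries in the array, one glyph on screen. */
              uint32_t y_tilde[] = { 0x0079, 0x033E };
              printf("code units: %zu\n", sizeof y_tilde / sizeof y_tilde[0]); /* 2 */
              return 0;
          }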

      This argument is getting silly. A char is a datatype in low-level languages and a string is not. You are arguing with my explanation using historical words outside of their intended context. Historically, char was short for character. It is also a good way for a layman to understand what char is. I was only trying to clarify someone else's not-so-clear remark. You are not helping me with that endeavor, and it's pedantry bordering on trolling or something. If the char datatype isn't designed to hold a character then what is it used for?

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 2) by Pino P (4721) on Thursday January 14 2016, @04:06PM (#289533) Journal

        If the char datatype isn't designed to hold a character then what is it used for?

        Let me try to sum up your argument and mine in a manner that addresses the point at hand: The data types called char were originally designed to hold a character, back when users of computing were members of cultures whose languages used few characters. As the number of cultures served by computing has grown, the data types called char have since become insufficient for that purpose.
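
        One way to picture that shift, as a short C sketch (assuming a UTF-8 source file): a char still holds exactly one byte, which was a whole character in ASCII's day but is now only a fragment of many characters.

            #include <stdio.h>

            int main(void)
            {
                char ascii = 'A';        /* one char, one character: fine for ASCII       */
                const char euro[] = "€"; /* UTF-8 spreads this one character over 3 chars */

                printf("sizeof(char) = %zu\n", sizeof ascii);       /* always 1      */
                printf("bytes in \"€\" = %zu\n", sizeof euro - 1);  /* 3 under UTF-8 */
                return 0;
            }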