Google continues to ram G+ into everything



  • @dhromed said:

     I actually quit tea a while ago. I drink mostly tap water.

     

    I used to drink plain tap water, and then found out I could flavor it by heating the water and passing it through some roasted ground beans, or by soaking dried, ground leaves in the heated water.

     



  • @dhromed said:

    A kettle that runs on electricity instead of being put on your stove, which is a kettle.

    Like an electric kettle?

    They aren't very popular in the US, since our 110 volt power system takes a long time to heat up water. Usually instead, Americans who need instant access to hot water will install one of those mini-water heaters in one of their sinks. Or Pyrex in the microwave is pretty fast.

    EDIT: That said, it is weird that if someone says, "I have a problem with my Toyota", that's fine in my brain-thinker. But if they say, "I have a problem with my Adobe", it gets my brain-thinker into annoyance-mode. Not sure why that is.



  • @dhromed said:

    If I'm not invested in the technology, like, not at all, and I just use it as a single-purpose appliance, then is it strange that I do not remember all these brand/company/software names?

    I've had similar thoughts lately about the progression of technology, specifically Macs and, more generally, tablets. Back when Windows 95 came out, it seemed we were on this trajectory where people were getting involved with computers and getting smarter and more logical; then the masses started getting weary of all this knowledge and demanded something easier. So the Mac started its ascent. Amazingly, people didn't even want this level of knowledge, so tablets started their dominance.

    I feel that we almost had something, but it got lost and now we're all being dragged down by the weight of ignorance and apathy. 

     



  • @anachostic said:

    Back when Windows 95 came out, it seemed we were on this trajectory where people were getting involved with computers and getting smarter and more logical; then the masses started getting weary of all this knowledge and demanded something easier. So the Mac started its ascent. Amazingly, people didn't even want this level of knowledge, so tablets started their dominance.

    Macs got more popular when they got taken over by the NeXT guys and became much more difficult to use, though, which kind of pisses all over your theory. Frankly, I don't know why Macs are trendy right now, but at least part of the cause is that they started making really excellent laptops when most PC makers were still shitting out plastic piece-of-shit laptops.

    @anachostic said:

    I feel that we almost had something, but it got lost and now we're all being dragged down by the weight of ignorance and apathy.

    We're being dragged down by ignorance and apathy, but not the users'. We're being dragged down by the average software developer's hatred of change. How can you expect software to get better when, whenever some software project makes a change, no matter how small, the developers themselves are the ones decrying it? How can software evolve in an environment where people will throw 20-page bitching sessions because Chrome removes the "http://" from the URL bar? How can you trust developers who hate the Ribbon interface, an interface that's vastly superior in every way to the pre-existing one, to improve the state of the art?

    Hell, half the popularity of Apple right now among developers is that their OS works exactly like Unix, an OS from the 70s. People cite that as a selling point! It's no wonder there's no (general) progress being made.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    We're being dragged down by the average software developer's hatred of change.
    You could just see that as an opportunity to seize market share…?



  • @blakeyrat said:

    We're being dragged down by ignorance and apathy, but not the users'. We're being dragged down by the average software developer's hatred of change. How can you expect software to get better when, whenever some software project makes a change, no matter how small, the developers themselves are the ones decrying it? How can software evolve in an environment where people will throw 20-page bitching sessions because Chrome removes the "http://" from the URL bar? How can you trust developers who hate the Ribbon interface, an interface that's vastly superior in every way to the pre-existing one, to improve the state of the art?

    Hell, half the popularity of Apple right now among developers is that their OS works exactly like Unix, an OS from the 70s. People cite that as a selling point! It's no wonder there's no (general) progress being made.

    I won't really argue that, since I'll admit I resist change on certain things. I guess I'm just saddened that the general population is OK with "good enough" instead of rising to the challenge of learning something more powerful.

    In much the same way that I don't want to have to deal with configuration via a text editor (i.e., Linux), mainstream users don't want to deal with configuration at all. In fact, they don't even want to deal with cursors any more. They just want to use their fingers to point at something, like chimpanzees. In one sense, I see this as progress, because the computer does so much more and a user expends so little effort. Why not? Isn't that the point of computers? But in another sense, I feel we're allowing people to remain dumber than they have to be.

     


  • Discourse touched me in a no-no place

    @blakeyrat said:

    our 110 volt power system takes a long time to heat up water
    What sort of current are domestic devices limited to normally? (Excluding things that have to be wired in specially like driers.) European devices are typically limited to 3kW, which is a ~13A draw.



  • Outlets are rated at 15-20 amps, aka 1.6-2.2 kW.
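
    Back-of-the-envelope, this is just P = V × I (a sketch using the nominal 110 V figure quoted upthread; US mains is nominally 120 V these days, so real numbers run slightly higher):

        # P = V * I: maximum continuous power for a given outlet rating.
        # Voltages below are the nominal figures quoted in this thread.
        def max_power_kw(volts, amps):
            return volts * amps / 1000.0

        for label, volts, amps in [("US 15 A outlet", 110, 15),
                                   ("US 20 A outlet", 110, 20),
                                   ("European 13 A device", 230, 13)]:
            print("%s: %.2f kW" % (label, max_power_kw(volts, amps)))

        # US 15 A outlet: 1.65 kW
        # US 20 A outlet: 2.20 kW
        # European 13 A device: 2.99 kW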



  • @anachostic said:

    I guess I'm just saddened that the general population is OK with "good enough" instead of rising to the challenge of learning something more powerful.
     

    More options does not equal more power. Power is the ability to do something. A command line, for instance, is powerful because it has all the options in a standard syntax, and if you rote-learn the commands then, like programming itself, it is insanely powerful.

    That power falls away instantly if you cross consoles or programming languages. You don't know this new console. You don't know this new language. Then, even though it still offers all of its options, it's no longer powerful.

    What some falsely decry as "dumbing down" is in fact granting power. GUIs can grant power. Discoverability grants power, even if you reduce the available options (which is not necessarily so).

    This is why it's so obvious that the Windows 8 Start Screen (yea, my pet example!) is such a prime example of true dumbing down: it removes options and removes power. The Start Screen's catchphrase should have been "You can now more easily do...", when instead it is "You can no longer."


  • Considered Harmful

    @dhromed said:

    GUIs can grant power. Discoverability grants power, even if you reduce the available options (which is not necessarily so).

    Fair enough, and also perhaps the reason many developers view them poorly. They already have the power of the CLI at their disposal, so their own power is effectively diminished by switching to a GUI [blakeybait]; however, they are atypical in that respect. Most users gain power from a GUI.



  • @Medezark said:

    I used to drink plain tap water, and then found out I could flavor it by heating the water and soaking dried, ground leaves in the heated water.
     

    Point taken, but I was put off by its long-lasting aftertaste and stopped enjoying its preparation ritual.

    I want hydration -> I gulp down a cup and get back to what I was doing.


  • ♿ (Parody)

    @dhromed said:

    That power falls away instantly if you cross consoles or programming languages. You don't know this new console. You don't know this new language. Then, even though it still offers all of its options, it's no longer powerful.

    What some falsely decry as "dumbing down" is in fact granting power. GUIs can grant power. Discoverability grants power, even if you reduce the available options (which is not necessarily so).

    Don't swallow everything that blakey spews about CLIs. Good ones do have some discoverability. And bad GUIs can be terrible with discoverability.



  • @joe.edwards said:

    so their own power is effectively diminished by switching to a GUI
     

    Hey, it happens to me too. I sometimes do things in code even when a program is available, because I happen to have the ability to mold the code and output to my liking.



  • @boomzilla said:

    CLIs. Good ones do have some discoverability.
     

    Totally. My rhetorical mistake there was to put "discoverability" and "reduce options" in the same sentence. The point was that reducing the (immediately obvious) options can grant power.



  • @blakeyrat said:

    That said, it is weird that if someone says, "I have a problem with my Toyota", that's fine in my brain-thinker. But if they say, "I have a problem with my Adobe", it gets my brain-thinker into annoyance-mode. Not sure why that is.
     

    Probably because Toyota makes cars, while Adobe makes 100 things for 100 purposes.

    Saying there's a problem with your Adobe is like saying there's a problem with your Phillips.

     

    ...

    Imagine someone says they have a problem with the noise of their Hitachi.  :O

     

    I assume they're talking about the hard drive.

     

     

     

     

    Not the construction vehicles.

     

     

     

     

     

     

     

     

     

    Or the vibrators.



  • @dkf said:

    European devices are typically limited to 3kW, which is a ~13A draw.

    13 amps would be a pretty big draw on a US circuit, which usually has a 15 or 20 amp breaker. It could be done, but God help anything else on the same circuit.

    A quick Google shows US electric kettles are usually rated at 2kW.
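
    As a sanity check on what that rating implies (a sketch, assuming the nominal 110 V figure from upthread):

        # I = P / V: the current a 2 kW kettle would pull from a 110 V circuit
        print(2000 / 110.0)  # ~18.2 A -- more than a 15 A breaker allows on its own,
                             # hence "God help anything else on the same circuit"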



  • @dkf said:

    @blakeyrat said:
    We're being dragged down by the average software developer's hatred of change.
    You could just see that as an opportunity to seize market share…?

    I do, but I lack capital to create a software company and I'm way too slack to quit and go out and find investors when I already have a comfy job.

    If someone wants to just hand me a few million dollars, please.



  • @anachostic said:

    I won't really argue that, since I'll admit I resist change on certain things. I guess I'm just saddened that the general population is OK with "good enough" instead of rising to the challenge of learning something more powerful.

    And let me guess, you think the creaky old 1970s Unix CLI bullshit equates to "more powerful".

    Even if you believed a CLI interface was "more powerful" than a GUI, you have to admit that a 20-year-old CLI with tons of warts and misfeatures isn't ideal. (Which isn't to say that 20-year-old GUIs don't have warts and misfeatures. Which I'm sure some idiot was just about to hit "Reply" and point out.)

    @anachostic said:

    In one sense, I see this as progress, because the computer does so much more and a user expends so little effort. Why not? Isn't that the point of computers? But in another sense, I feel we're allowing people to remain dumber than they have to be.

    I'm waiting with bated breath to see your evidence that simplified computer interfaces make people "dumber".



  • @joe.edwards said:

    They already have the power of the CLI at their disposal,

    I'm a developer and I don't use a CLI unless I absolutely have to.

    Partly because I genuinely believe a CLI does *not* make me more efficient in the general case. But mostly because I think it keeps me closer to my users, and helps me to write better software as a result. And super-mostly because I really suck at CLIs because I don't have the rote memorization skills required to use one and sometimes I have trouble typing strings of text precisely.



  •  @blakeyrat said:

    And let me guess, you think the creaky old 1970s Unix CLI bullshit equates to "more powerful".

    Even if you believed a CLI interface was "more powerful" than a GUI, you have to admit that a 20-year-old CLI with tons of warts and misfeatures isn't ideal. (Which isn't to say that 20-year-old GUIs don't have warts and misfeatures. Which I'm sure some idiot was just about to hit "Reply" and point out.)

    I'm waiting with bated breath to see your evidence that simplified computer interfaces make people "dumber".

    I don't think CLIs are more powerful. They are more efficient for specific types of work (which is a form of power). As far as usability goes, auto-complete was a godsend, and if there is somehow an IntelliSense-like feature in the future, discoverability will be covered, too.

    Simplified UIs don't exactly make people "dumber", but they prevent people from getting "smarter". There's plenty of research that says beginners don't stay beginners for very long, and at the same time, only a few people feel limited by a moderate level of complexity. To go back to my original post, I feel like we almost had a chance to bring people into this middle level of competency, but tech makers catered to the lowest-level consumer and stifled that progress.

    Touch interfaces coupled with modern UX design have very low discoverability. The modern "flat" design removed a lot of affordance by no longer identifying a button as a raised object. That left a mouse hover as the only way to discover whether an item was clickable. With touch, even the hovering concept is removed. Don't even get me started on the "discoverability" of edge swiping. So if anything, simplified UIs are worse now because they offer an even more limited function set.

     (I'm a little bummed I've hijacked the discussion to be about usability instead of the original concept of having Internet "portals", like the AOL of old.)

     



  • @blakeyrat said:

    @joe.edwards said:
    They already have the power of the CLI at their disposal,

    I'm a developer and I don't use a CLI unless I absolutely have to.

    Partly because I genuinely believe a CLI does *not* make me more efficient in the general case. But mostly because I think it keeps me closer to my users, and helps me to write better software as a result. And super-mostly because I really suck at CLIs because I don't have the rote memorization skills required to use one and sometimes I have trouble typing strings of text precisely.

    Do I have to step in and be the voice of reason again? Can we all just please agree that whatever tools you find to be the most useful are the ones you should use and not get into a ridiculous dick-measuring contest about whose opinions are "more right" than the other? Rehashing this shit for the nth time is boring.



  • @blakeyrat said:

    Like an electric kettle?

    They aren't very popular in the US, since our 110 volt power system takes a long time to heat up water. Usually instead, Americans who need instant access to hot water will install one of those mini-water heaters in one of their sinks. Or Pyrex in the microwave is pretty fast.

    YMMV, of course, but the electric kettle I use every morning heats water noticeably faster than a microwave. (Yes, I'm sufficiently "invested" in it that I remember the brand.) It is slower than a mini-water heater (basically instant), but making tea properly requires boiling water, and no mini-water heater produces boiling water<rant class="stupid US tort law"> because somebody might burn him/her/itself and sue the manufacturer</rant>. Bringing 2 cups (~500ml) of water to a full boil in the microwave takes between 2 and 3 minutes; the electric kettle takes a little more than half that.
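
    For what it's worth, the timing is consistent with a back-of-the-envelope heat calculation (a sketch only: it assumes the ~2 kW rating mentioned upthread, ~20°C tap water, and zero heat loss, so real-world times run a bit longer):

        # t = m * c * dT / P, with c = 4186 J/(kg*K) for water.
        # Assumes every watt ends up in the water, so these are lower bounds.
        C_WATER = 4186  # J/(kg*K)

        def heat_time_min(liters, delta_t_c, power_watts):
            mass_kg = liters  # 1 litre of water is about 1 kg
            return mass_kg * C_WATER * delta_t_c / power_watts / 60.0

        # 500 ml from 20 C to a 100 C boil:
        for label, watts in [("2 kW kettle", 2000), ("3 kW kettle", 3000)]:
            print("%s: %.1f min" % (label, heat_time_min(0.5, 80, watts)))

        # 2 kW kettle: 1.4 min
        # 3 kW kettle: 0.9 min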

     



  • @mikeTheLiar said:

    Rehashing this shit for the nth time is boring.
    I don't really care who is right or wrong in most of the arguments I read on here, as long as the argument is interesting. Boring is the worst of the worst.



  • @mikeTheLiar said:

    Do I have to step in and be the voice of reason again?

    No you do not.

    @mikeTheLiar said:

    Can we all just please agree that whatever tools you find to be the most useful are the ones you should use and not get into a ridiculous dick-measuring contest about whose opinions are "more right" than the other?

    The problem is:

    1) I frequently don't get a choice in what tools I have to use at work. At my last job, I was hired on the basis that I'd be able to set up the programming environment from scratch, but not two months went by before I was using PHP, JSP, Ruby and Git*.

    That wouldn't be an issue, except 2) a lot of development tools are completely unfinished, Ruby and Git both being perfect examples. Both are the result of some open source group writing a third of a solution, then shipping it as if it were complete. Great, Git. You have a nice server. Now where's the GUI? Awesome programming language and compiler, Ruby. But, uh, where's your IDE? Graphical debugger? You're not DONE yet, guys!

    *) BTW, usually I get pulled into these tools because the developers who picked PHP, JSP, Ruby and Git in the first place couldn't find their ass with both hands, and the C# user had to swoop down from on high and fix their broken shit.

    @mikeTheLiar said:

    Rehashing this shit for the nth time is boring.

    You're not the dictator of this forum.



  • @blakeyrat said:

    They aren't very popular in the US, since our 110 volt power system takes a long time to heat up water. Usually instead, Americans who need instant access to hot water will install one of those mini-water heaters in one of their sinks. Or Pyrex in the microwave is pretty fast.
    To what temperature do such water heaters heat water? Because the ones we use in kitchens (5-10l) are typically set to heat to 50-60°C, which is nowhere near enough for tea or coffee.
    @blakeyrat said:
    13 amps would be a pretty big draw on a US circuit, which usually has a 15 or 20 amp breaker. It could be done, but God help anything else on the same circuit.
    Schuko outlets and connectors are rated for 16A, and in modern installations they're often on their own breakers, allowing you to use the full potential if you so wish (however, unless the outlet is expected to be used for such high-powered machinery, it's not unusual to have a 10A breaker on it).
    @dhromed said:
    Or the vibrators.
    I'd assume it was their Hi-Fi (but only because my grandfather has one, and I played with it a lot as a child).


  • BINNED

    @anachostic said:

    But in another sense, I feel we're allowing people to remain dumber than they have to be.
    Haven't you heard that we should all be able to do arbitrarily complex tasks without having to learn anything?


  • Considered Harmful

    @anachostic said:

    The modern "flat" design removed a lot of affordance by no longer identifying a button as a raised object. That left a mouse hover as the only way to discover whether an item was clickable.

    It's possible to do flat design right.



  • @joe.edwards said:

    @anachostic said:
    The modern "flat" design removed a lot of affordance by no longer identifying a button as a raised object. That left a mouse hover as the only way to discover whether an item was clickable.

    It's possible to do flat design right.

    QFT - Thanks for that.


  • Discourse touched me in a no-no place

    "How can you trust developers who hate the Ribbon interface, an interface that's vastly superior in every way to the pre-existing one, to improve the state of the art?"

    Hey, shut up! People LIKE having to guess, for any given function, whether they should look for a toolbar button, a menu, or a task pane interface, all of which work differently and have different UI.



  •  @joe.edwards said:

    It's possible to do flat design right.

    I like this.  Thank you. 


  • Discourse touched me in a no-no place

    @blakeyrat said:

    Usually instead, Americans who need instant access to hot water will install one of those mini-water heaters in one of their sinks. Or Pyrex in the microwave is pretty fast.

    We've tried them at work. The mini water heaters don't make the water hot enough for tea (unlike coffee, tea's brew quality depends critically on keeping the temperature high; a steam line is good if you've got one, but that's rare in most modern workplaces) and microwaves tend to heat the water container's handles quite a bit.

    The stupid thing is I know how to make good tea despite not drinking it. Coffee is my favoured source of trimethylxanthine…
    @ender said:

    Schuko outlets and connectors are rated for 16A, and in modern installations they're often on their own breakers, allowing you to use the full potential if you so wish (however, unless the outlet is expected to be used for such high-powered machinery, it's not unusual to have a 10A breaker on it).
    UK plugs and sockets are designed for 13A draw (i.e., a 3kW load on a 230V circuit) and they're typically wired so that the circuit itself can have several such loads on it at once (ring mains have much less of a problem with voltage drop due to loads, at a penalty of being more expensive to install). The breaker is usually a 30A unit, and will also trip on a much lower earth current in any sane deployment. A 3kW kettle is fast at doing its only task (though 2kW models are much more common).

    High current draw devices (ovens, driers, larger AC units) need explicit wiring in, and may have their own dedicated breaker. I'd guess that would be the same anywhere else (that isn't a total death trap).


  • Discourse touched me in a no-no place

    @joe.edwards said:

    It's possible to do flat design right.
    Someone who knows what an affordance is and why it matters!



  • @dkf said:

    and microwaves tend to heat the water container's handles quite a bit.
    This shouldn't happen with proper microwave-safe cups.
    @dkf said:
    (ring mains have much less of a problem with voltage drop due to loads, at a penalty of being more expensive to install)
    I thought the UK used ring circuits because they needed less copper, and were thus cheaper to install?
    @dkf said:
    The breaker is usually a 30A unit, and will also trip on a much lower earth current in any sane deployment.
    RCD? At least here, that's separate from the breaker (and if it trips, it'll cut power to everything).
    @dkf said:
    High current draw devices (ovens, driers, larger AC units) need explicit wiring in, and may have their own dedicated breaker.
    Here it depends - my washing machine and drier share a socket, and there haven't been any problems. My oven (with glass-ceramic cooking plate) is wired in, and uses 3 phases, but my aunt's oven just has a normal Schuko connector (but the cooking plate has its own connector, and each outlet is on its own breaker).


  • Discourse touched me in a no-no place

    @ender said:

    I thought the UK used ring circuits because they needed less copper, and were thus cheaper to install?
    Cheaper than a 60A cable maybe, but not as cheap as what everyone else uses. (Why do we have this? An excessive use of 3kW electric heaters in the early 20th century is my guess, due to widespread poor home insulation in a relatively cool climate, and so the infrastructure to support them got codified.)

    Dedicated lighting circuits don't use ring mains. They're also set to trip for much lower currents.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    I'm waiting with bated breath to see your evidence that simplified computer interfaces make people "dumber".
    The propensity Microsoft has for making each new version of Windows look even more like a Fisher-Price toy, perhaps?

