csHokie
May 4, 06:32 PM
... plain, brown rapper.
Don't be racist... what is wrong with other color rappers?
Sorry, couldn't resist.
I for one will opt for the hard media unless I can download the image and burn it. I'd want to start with a completely fresh install.
iRun26.2
Apr 23, 09:58 PM
a retina display on the 13" MBP would be the one thing that would get me to upgrade almost immediately.
Your reaction is nearly identical to mine (although I am interested in seeing a Retina Display on the 11.4" MBA):
Double the pixel density on the 11.4" MBA screen, and I will pay $3k for that computer on the spot (even if I just upgraded to the Sandy Bridge version the week before). The stunning display on the iPhone 4 put them into a class unmatched by their rivals.
I can't wait...even if it still takes years to trickle down to the MBA. Someday all computer screens will have Retina Displays (and we will only see screens where the pixels are visible in a museum). Although I may be dead by then... :)
0815
May 4, 04:44 PM
What about Enterprise users?
They can use the non-preferred option ... preferred doesn't mean it's the one and only option - it means it is one of n options (n >= 2)
CalBoy
May 3, 10:23 PM
The advantage you're talking about here is one of degree. One may be slightly faster than the other, but it's not a revolutionary shift to a better system. I would compare this sort of change to a small upgrade in processing power. The advantages of the metric system over imperial run much deeper than that, so it's a poor analogy.
Can you cite reliable figures for the cost advantage versus the cost to switch?
Moyank24
May 4, 01:51 PM
I'd think we'd want to explore this room.
Gotcha. Just wanted to make sure exploring the hallway was one of the choices. I'm all for exploring the hallway.
Sweetfeld28
Nov 26, 07:32 PM
Like I stated in one of the other threads, this would be a great buy for teachers, artists, photographers, or anyone else on the go. But I think it would also be better if it were like IBM's tablet PC: one where you can have a laptop one minute, then a tablet the next.
damarsh
Mar 29, 02:18 PM
Can I just say I am amazed at some of the responses on this thread. Typical American, and often I must admit British, protectionism coming straight out like a bad smell. Without these so-called "3rd world" workers, Apple would be a lowly player. Firstly, Japan is not "3rd world". It is one of the most developed countries in the world, and has some of the most adept and intelligent people on this planet. Secondly, the terms "3rd world" and "1st world" are offensive. The proper terms are developing and developed world. Thirdly, I am sure that we will all be fine if we don't get a few iPod batteries or glue. People have died over there and continue to die because of this tragedy. This is surely more important than a load of old microchips. Sorry. Rant over.
:cool::apple::(
Full of Win
Apr 21, 05:08 PM
I think the next Mac Pro refresh will be a huge milestone. Not only will it be the first case redesign in nearly a decade and add all the latest tech (USB 3, SATA III, Thunderbolt, etc.), but I believe Apple will take this opportunity to finally revise the pricing structure. Over the past few years, Apple has been making a clear shift towards the consumer market. Part of that is arguably negative ("dumbing things down") but the positive is more reasonable prices. The Mac Pro is the only computer left that hasn't been revised. My hope is that Apple will create a few models of the new Mac Pro, at least one of which is an affordable mid-range consumer tower starting under the $2,000 mark.
Unfortunately, they will probably wait to use the new performance desktop/server Sandy Bridge CPUs, which Intel won't have ready until Q4 2011 (or later). If that's true then we won't see these new beauties until 1H 2012. :(
Basically, what many of us have been asking/begging Apple to do: release an iMac without the display and with removable hard drives.
Glideslope
Mar 29, 08:53 PM
seismologist?
Atta boy!!! :apple:
regandarcy
Mar 27, 06:52 AM
I'm all for cloud computing as an added feature....but not as a replacement for traditional storage of media and data.
I mean, I hope Apple doesn't force people to be connected to the cloud. I think that would be a mistake. Mainly because it would force you to either have access to a wifi signal, or pay for an expensive data plan just to gain access to your media.
As it is, all the telecom companies are dropping their unlimited plans and switching to tiered pricing. I think this makes it hard for users to enjoy their content freely without constant fear of exceeding their data plans.
And what of people with iPod touches or wifi-only iPads...who are not within range of a wifi signal...and cannot access their content as a result? That would be very frustrating and limiting. It would make their devices nothing more than expensive paperweights.
It also creates a problem for those with 3G iPads or iPhones trying to access large video or media files in their cloud, I think. I mean, have you ever tried to watch a YouTube video over 3G? It SUCKS! So you'd be using up tons of bandwidth on a tiered data plan for crappy quality. How is that good?
And if the iPhone 5 is the first Apple device to use 4G speeds...won't that eat up even more bandwidth? Running an even greater risk of you going over your limit and being charged outrageous fees by your service provider? Be it AT&T or Verizon?
I understand that the concept of the cloud is freedom at its core...the ability to have access to your media across multiple devices without having to store it on just one...but then you become a slave to the telecom companies and their tiered data plans...thus defeating that freedom.
Plus it forces you to choose a 3G iPad, or Apple to put 3G into iPod touches, to make it useful.
So I get it, and I don't get it.
The original concept of the iPod was to be able to carry all your music with you. Total freedom. And that's what helped make it such a huge success. Then came the iPhone and iPad. Both equally cool for music and video. You could store all your data on them and listen or watch them at your leisure on the go.
But if you then force people to store their data on a cloud...and pay for an expensive tiered data plan to access that data...to me it becomes not so free anymore. In fact, it becomes downright restrictive and suffocating IMHO.
As long as Apple doesn't abandon the ability to store your media ON your device, I'm cool with this move. The cloud would just become an added bonus which you could use or not use at your discretion.
I just think having to be connected to the cloud via wifi or 3G to access your data is kind of annoying....not to mention potentially EXPENSIVE!
Once in a while...OK. But not as one's main means of access. I'd much rather have the bulk of my music and data actually stored ON my device. Much more convenient if you ask me.
Flash drives are big enough to carry most if not all the music and video you need. Why store it all on Apple servers on some big farm in North Carolina, where you need a wifi connection or an expensive tiered data plan just to access it? Don't see the point.
Is it just me? :-)
dba7dba
Apr 26, 03:08 PM
Add me as another purchaser of an Android phone. I myself have an iPhone. I wanted to buy a smartphone for a family member and considered an iPhone, but one thing that drove me away from the iPhone was the requirement of a PC to activate it. No such requirement for Android.
pika2000
Mar 27, 01:44 AM
Cloud-based music and video streaming? LOL. Good luck doing that on a 200MB cap with AT&T. :rolleyes:
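For scale, a rough back-of-envelope (assuming a typical 128 kbps music stream; that bitrate is an assumption, not a figure from the thread):
$$128\ \text{kbps} \approx 0.96\ \text{MB/min} \;\Rightarrow\; \frac{200\ \text{MB}}{0.96\ \text{MB/min}} \approx 208\ \text{min} \approx 3.5\ \text{hours of streaming per month}$$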
This is why the iPhone will never be sold unlocked in the US. I bet Apple would be forced to do another deal with AT&T and/or Verizon to give them some kind of a break on the data plans so these cloud-based services will actually be usable.
AaronEdwards
Apr 26, 02:43 PM
And there's a huge difference between a 17" Macbook Pro and a 11" Macbook Air.
But they both get counted as laptops, don't they?
And what's your reasoning for why iPods don't get counted here? Because they don't have monthly contracts? How does that make sense? Should we only count iMac sales if they're hooked up to a monthly ISP or something?
Think about this.
People didn't argue that the iPod or the iPad should be counted until Android smartphones started to get really close to, and then overtake, iOS smartphones.
You can demand that they should be counted, but everyone will know the real reason for it.
pkson
Apr 7, 09:55 PM
So you want Apple to be forced by the government to reduce its manufacturing, tell its customers "sorry, no iPad for you" because the competition needs to catch up? How stupid is that? :rolleyes:
Sadly, that's what happens in South Korea.
Which is the reason why the world has to put up with Samsung.
MacFly123
Apr 22, 02:38 PM
Seriously? We also do full DVD high-end Hollywood-type authoring at my facility (and have been for 10+ years), plus Blu-ray authoring, and we have no need for internal optical super drives.
You guys seriously need to unhinge yourselves from those internal drives...lol :)
Why should I have to buy another piece of hardware that is ugly and not integrated just to be able to do what my clients want?
Wait till the 2012 update then axe them forever! I don't care, but this year is a bit premature. The online delivery ecosystem still has a lot to work out! I am all for the future, but we are not quite there yet.
toddybody
Apr 24, 09:10 AM
PS: Happy Easter everyone :)
iGary
Aug 7, 05:47 PM
http://www.blogsmithmedia.com/www.engadget.com/media/2006/08/dsc_0631.jpg
http://www.blogsmithmedia.com/www.engadget.com/media/2006/08/dsc_0641.jpg
http://www.blogsmithmedia.com/www.engadget.com/media/2006/08/dsc_0636.jpg
Kinda ugly.
CalBoy
May 3, 03:39 PM
I see no reason why 99, 99.5, and 100 are easier to track than 37.2, 37.5, and 37.7. As you said, we accept body temp to be 98.6 and 37.0 in Celsius. If decimals are difficult to remember, then clearly we should pick the scale that represents normal body temp as an integer, right? ;)
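For reference, the standard conversion confirms those two triples are the same temperatures:
$$T_F = \tfrac{9}{5}T_C + 32:\quad 37.2\,^{\circ}\mathrm{C} \approx 99.0\,^{\circ}\mathrm{F},\quad 37.5\,^{\circ}\mathrm{C} = 99.5\,^{\circ}\mathrm{F},\quad 37.7\,^{\circ}\mathrm{C} \approx 99.9\,^{\circ}\mathrm{F}$$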
It doesn't matter what normal body temperature is because that's not what people are looking for when they take a temperature; they're looking for what's not normal. If it can be helped, the number one is seeking should be as flat as possible.
There is a distinctive quality about 100 that is special. It represents an additional place value and is a line of demarcation for most people. For a scientist or professional, the numbers seem the same (each with 3 digits ending in the tenths place), but to the lay user they are very different. The average person doesn't know what significant digits are or when rounding is appropriate. It's far more likely that someone will falsely remember "37.2" as "37" than they will "99" as "98.6." Even if they do make an error and think of 98.6 as 99, it is an error on the side of caution (because presumably they will take their child to the doctor or at least call in).
I realize this makes me seem like I put people in low regard, but the fact is that most things designed for common use are meant to be idiot-proof. Redundancies and warnings are hard to miss in such designs, and on a temperature scale, one that makes 100 "dangerous" is very practical and effective. You have to keep in mind that this scale is going to be used by the illiterate, functionally illiterate, the negligent, the careless, the sloppy, and the hurried.
The importance of additional digits finds its way into many facets of life, including advertising and pricing. It's essentially the only reason why everything is sold at intervals of "xx.99" instead of a flat price point. Marketers have long determined that if they were to round up to the nearest whole number, it would make the price seem disproportionately larger. The same "trick" is being used by the Fahrenheit scale; the presence of the additional digit makes people more alarmed at the appropriate time.
Perhaps your set of measuring cups is the additional piece of equipment. Indeed you wouldn't need them. For a recipe in SI, the only items you would need are an electronic balance, a graduated measuring "cup," and a graduated cylinder. No series of cups or spoons required (although they do of course come in metric for those so inclined).
Of course any amateur baker has at least a few cups of both wet and dry so they can keep ingredients separated but measured when they need to be added in a precise order. It just isn't practical to bake with 3 measuring devices and a scale (which, let's be real here, would cost 5 times as much as a set of measuring cups).
This also relies on having recipes with written weights as opposed to volumes. It would also be problematic because you'd make people relearn common measurements for the metric beaker because they couldn't have their cups (i.e. I know 1 egg is half a cup, so it's easy to put half an egg in a recipe - I would have to do milliliter division to figure this out for a metric recipe even though there's a perfectly good standard device for it).
It might seem that way to you, but the majority of the world uses weight to measure dry ingredients. For them it's just as easy.
Sure when you have a commercial quantity (which is also how companies bake in bulk-by weight), but not when you're making a dozen muffins or cupcakes. The smaller the quantity, the worse off you are with weighing each ingredient in terms of efficiency.
Why would you need alternative names? A recipe would call for "30ml" of any given liquid. There's no need to call it anything else.
So what would you call 500ml of beer at a bar? Would everyone refer to the spoon at the dinner table as "the 30?" The naming convention isn't going to disappear just because measurements are given in metric. Or are you saying that the naming convention should disappear and numbers used exclusively in their stead?
Well, no one would ask for a 237ml vessel because that's an arbitrary number based on a different system of units. But if you wanted, yes, you could measure that amount in a graduated measuring cup (or weigh it on your balance).
In that case, what would I call 1 cup of a drink? Even if it is made flat at 200, 250, or 300ml, what would be the name? I think by and large it would still be called a cup. In that case you aren't really accomplishing much because people are going to refer to it as they will and the metric quantity wouldn't really do anything because it's not something that people usually divide or multiply by 10 very often in daily life.
I suspect people would call it a "quarter liter," much like I would say "quarter gallon."
No, that would be 1/4 of a liter, not 4 liters. I'm assuming that without gallons, the most closely analogous metric quantity would be 4 liters. What would be the marketing term for this? The shorthand name that would allow people to express a quantity without referring to another number?
And no, you wouldn't call 500ml a "pint" because, well, why would you? :confused:
Well, I'm assuming that beer would have to be served in metric quantities, and a pint is known the world over as a beer. You can't really expect the name to go out of use just because the quantity has changed by about 25ml.
...But countries using SI do call 500ml a demi-liter ("demi" meaning "half").
Somehow I don't see that becoming popular pub lingo...
This is the case with SI units as well. 500, 250, 125, 75, etc. Though SI units can also be divided by any number you wish. Want to make 1/5 of the recipe? ...Just divide all the numbers by five.
Except you can't divide the servings people usually take for themselves very easily by 2, 4, 8, or 16. An eighth of 300ml (a hypothetical metric cup), for example, is a decimal. It's not very probable that if someone was to describe how much cream they added to their coffee they'd describe it as "37.5ml." It's more likely that they'll say "1/4 of x" or "2 of y." This is how the standard system was born; people took everyday quantities (oftentimes as random as fists, feet, and gulps) and over time standardized them.
Every standard unit conforms to a value we are likely to see to this day (a man's foot is still about 12 inches, a tablespoon is about one bite, etc). Granted it's not scientific, but it's not meant to be. It's meant to be practical for describing everyday quantities, much like "lion" is not the full scientific name for Panthera leo. One naming scheme makes sense for one application and another makes sense for a very different application. I wholeheartedly agree that for scientific, industrial, and official uses metric is the way to go, but it is not the way to go for lay people. People are not scientists. They should use the measuring schemes that are practical for the things in their lives.
Not that OS X Panthera Leo doesn't have a nice ring to it, of course. ;)
No, but it is onerous for kids to learn SI units, which is a mandatory skill in this global world. Like I said, why teach kids two units of measure if one will suffice?
It's onerous to learn how to multiply and divide by 10 + 3 root words? :confused: Besides, so many things in our daily lives have both unit scales. My ruler has inches and cm and mm. Bathroom scales have pounds and kg. Even measuring cups have ml written on them.
You could be right for international commerce where values have to be recalculated just for the US, but like I said, I think those things should be converted. I don't really care if I buy a 25 gram candy bar as opposed to a 1 ounce candy bar or a 350ml can of soda.
Perhaps true, but just because you switch to metric, doesn't mean you need to stop using tablespoons and teaspoons for measurements. It's all an approximation anyway, since there are far more than 2 different spoon sizes, and many of them look like they're pretty much equal in size to a tablespoon.
I'm sorry, but which tablespoons do you use that aren't tablespoons? The measuring spoons most people have at home for baking are very precise and have the fractions clearly marked on them.
Other than that, there's a teaspoon, tablespoon, and serving spoon (which you wouldn't use as a measurement). The sizes are very different for each of those and I don't think anyone who saw them side by side could confuse them.
So if you're cooking, do what everyone else does with their spoons; if you need a tablespoon, grab the big-ish one and estimate. If you needed more precision than that, why wouldn't you use ml? :confused:
Because it's a heck of a lot easier to think, "I need one xspoon of secret ingredient" than it is to think, "I need xml of secret ingredient." You think like a scientist (because you are one). Most people aren't. That's who the teaspoons and tablespoons are for.
teme
Aug 7, 03:45 PM
2. What applications do you need that a Mac Mini Core Duo can't handle? Oh, games? Why in the sweet baby Jesus' name are you on MACrumors if you're a gamer? Apple cedes your kind to Dellienware. Go. Shoo. Leave the grownups alone.
Here's another point of view: I want to use OSX in everyday use (Safari, Mail, iTunes, graphic design, Dreamweaver, etc... and OSX overall). But sometimes I want to play games too, and it's awesome that nowadays it's possible to boot into Windows, play games there, and then boot back into OSX. Are you saying that Apple should totally forget all users who would like to use OSX but occasionally play games on Windows, and let them buy PCs? Most gamers do not use their computer ONLY to play games. A consumer tower would be good for Apple to get new switchers and gain more market share.
marvel2
Jan 11, 10:28 PM
Little problem with my TT car kit. My iPhone no longer automatically pairs with the car kit when I plug it in. I used to be able to turn BT on and plug it into the TT kit and it would pair in a few seconds. Now I have to manually pair the two by going into the BT settings on the iPhone.
Anyone else with this problem?
kalsta
May 3, 09:41 PM
No, once again, it's not about comfort; it's about experience. I learned mostly SI units when I was in college, I'm quite comfortable with using those units - but the industry doesn't use those units. I learned, and became an expert in, the units used by the industry. You would ask millions of engineers, technicians, etc. to throw away years or even decades of experience simply to change a system that isn't broken.
Yes, it's a system that has its roots in the past, but the system still works. There's no compelling reason to change it. There's no efficiency to be gained.
When the Mac first came out, with its GUI and mouse, it wasn't a runaway success, although to those in the know it was vastly superior to PCs running DOS. The arguments for staying with DOS were no doubt similar to yours… 'I spent years becoming an expert in DOS. I am comfortable with it. It works just fine. There is no need to change. Besides, it would be too costly to change.'
When you say there is 'no compelling reason to change', you're ignoring all the points already made. Base-10. Derived units. Consistent prefixes. This makes for much simpler calculations and formulas in practice. It might be harder for an old fella like you to have to relearn things, but for the next generation of children learning from scratch, the metric system simplifies things so much. Not only that, but the USA is increasingly out of step with the rest of the world in this regard. So not only is this generation of Americans making it more difficult for future generations of Americans, but it's really complicating things for everyone in this age of global communication.
Okay, imagine for a moment that one of the US states wasn't using the decimal system for counting. Instead, they had a system where letters were used to designate certain amounts, similar to Roman numerals, but instead of having a base of 10, it varied. So perhaps A is equal to 12. Then three As is equal to B. Two Bs is equal to C. 22 Bs is equal to a D, and so on with this kind of inconsistency. You have a friend living in this state who claims that the system works just fine — he spent many years studying this system and even more using it in his line of work and can't see why he or anyone else in the state should have to learn this dangfangled decimal system. What would you say to your friend?
realitymonkey
Mar 31, 08:14 AM
http://www.dailymail.co.uk/news/article-1053152/Apple-admit-Briton-DID-invent-iPod-hes-getting-money.html
Ah yes, can we have a decent source please - not that ridiculous piece of ill-conceived drivel that is the Daily Mail.
Captain Planet
May 7, 01:08 PM
Oh man! That would be great... but I have a hard time seeing Apple do this. I'd be happy with like a "basic" version that'd be free... and for those who want the whole package, some sort of fee... but not $99 per year. Only time will tell I guess.
ThaDoggg
Mar 28, 09:47 AM
It's important that Apple starts to devote some serious time to its operating systems as well. I don't see any major drawbacks to delaying any potential new hardware introductions.