Cidu Bill on May 16th 2017


First of all, Huh?

Second, don’t dryers shut off after they buzz to tell you they’re done?

Filed in Arlo and Janis, Bill Bickel, CIDU, Jimmy Johnson, comic strips, comics, humor | 42 responses so far

42 Responses to “Buzzzz”

  1. Mona May 16th 2017 at 12:13 am 1

    Janis is rolling her eyes because Arlo thinks that “turning it off” is being a big help, when that is probably all he has ever done when it comes to laundry.
    Janis might have her dryer set to continue “fluffing” forever (until you turn it off) so that the clothes do not get wrinkled.

  2. Mona May 16th 2017 at 12:14 am 2

    (Continued from above….) And it would continue to buzz every so many minutes until you did shut it off (or open the door).

  3. Mona May 16th 2017 at 12:21 am 3

    And….Janis would probably prefer that Arlo put the movie on pause while she folds the towels and puts them away (which would probably take five minutes or less), but that would cause Arlo to grumble because he hates waiting while looking at a paused screen. Or maybe that’s just me and Hubby. (Oh, no, did I say that out loud?)

  4. Meryl A May 16th 2017 at 02:03 am 4

    Mine buzzes once when done, or at least it is supposed to. I am upstairs and use a timer to remember to go down.

    Mine is probably considered old - I bought it in July 1996 so newer ones may be different. (I know when as Robert’s dad was in the hospital before he died and it was the first large appliance I bought alone.)

  5. Kilby May 16th 2017 at 04:47 am 5

    Mona has it @1. And @2. And @3.

    P.S. Our dryer rolls everything over every few minutes for quite a while after it finishes the cycle. However, we never activate the buzzer feature, because it is so nerve-wrackingly loud that it’s better to have wrinkled clothes than to hear it even once.

  6. Olivier May 16th 2017 at 06:29 am 6

    At my parents’, the microwave beeps so loudly when it’s done that I stand next to it to stop it before the last second has elapsed. Plus, if you don’t come immediately, it beeps every ten seconds or so. It drives me crazy.

  7. Mitch4 May 16th 2017 at 07:18 am 7

    My first thought was that this was a case of divergent taste in movies, and the one they are watching was from Janis’s preferences. So Arlo doesn’t much mind missing some while he goes to deal with the dryer.

  8. Mitch4 May 16th 2017 at 07:22 am 8

    It was a family in-joke when an incandescent bulb blew out, or a fuse, to urge everyone (especially guests!) to raise their arms above their heads and waggle their hands. Because of course “Many hands make [the] light work.”

  9. Harley May 16th 2017 at 10:51 am 9

    Methinks the joke is the “Dish” Towels - also called “Hand” Towels.

    So, many “hands” make the work lighter!

  10. Bookworm May 16th 2017 at 12:23 pm 10

    Olivier #6, mine does the same thing. Apropos of nothing at all, I wish microwave manufacturers would program the display to count up so you could see how long it’s been sitting since it went off. Most of the time, I want to leave the food untouched for a minute or two to let the temperatures even out throughout the food. Seems like an easy feature to program.

  11. Cidu Bill May 16th 2017 at 03:16 pm 11

    Bookworm, it must be very easy, because the timer on my phone (and probably yours) does this: when the countdown is complete, it begins a “count-up” in a different color.

  12. James Pollock May 16th 2017 at 03:40 pm 12

    The processor in a phone is WAY more powerful than the processor in an appliance.

  13. chakolate May 16th 2017 at 03:45 pm 13

    I bought my stepmom her first computer when she was in her late 60s, and she was really eager for it. I bought her a really inexpensive one, and told her it was a perfectly good computer for her needs but it just didn’t have a lot of bells and whistles, like answering the phone for her and such.

    About a half hour later I was in the kitchen when the dryer went off, the timer she’d set for the washer went off, the oven timer went off, the dishwasher beeped, and the doorbell rang. She turned to me and said, “Thanks for the no-bells-and-whistles part”.


  14. Cidu Bill May 16th 2017 at 04:00 pm 14

    True, James, but I expect this particular feature is, at most, the 396,457th most important program in the phone, and requires a negligible amount of its memory and power.

  15. James Pollock May 16th 2017 at 04:44 pm 15

    The processor that was used in many early-generation home computers… Atari, Apple II, Commodore 64… was originally developed as an appliance controller. About 20 years’ worth of cars used a 386 controller for the emissions-control system.
    Today, the cheapest smartphone you can get has a processor that runs between 1000 and 2000 times faster than those old home computers, and over 100 times as fast as the embedded 386 in the car computers. (That same smartphone chip is probably running the dash of your car, if you have an electronic dash.)

    It’s not a small difference. This means that adding a new feature (no matter how trivial it seems) is probably way more expensive. There isn’t much margin in a microwave oven… using the cheapest-possible part in a pre-existing assembly is the way most of the manufacturers are going to go.

    But what do I know? My microwave is pre-digital, and still manages to make frozen things turn hot. I’ve had it since the mid-80’s, and I didn’t buy it new.

  16. Bookworm May 16th 2017 at 04:54 pm 16

    chakolate #13, I loved your story!

  17. Christine May 16th 2017 at 06:32 pm 17

    Bill - there is a huge problem these days with bloat for computer programmes (and smartphone apps). Just because something shouldn’t require a lot of processor power/memory doesn’t mean that it doesn’t use a lot. So the timer probably uses a lot more of the phone’s resources than you’d expect. Back when I took embedded systems (i.e. computers that control things like appliances), it was the only kind of programming where “best practices” about bloat and memory management still mattered. So even just running the microwave might be using a large amount of the onboard computer’s capacity.

    That said, I took that course a decade ago, and it was after that when I noticed a sharp shift from appliances having electronic controls, to appliances needing you to enter a programme every time you wanted to use them. So it might not be true anymore.

  18. Ted from Ft. Laud May 17th 2017 at 12:31 am 18

    The problem with adding that sort of functionality isn’t the processing power of the embedded processor - it is certainly adequate to count up. The issue is that for something like a microwave oven, parts costs are critical, so they use the cheapest device they can get away with, which is a microcontroller (a very low-end microprocessor targeting low-end - and cheap - embedded uses, which has the program storage and RAM and timers and pretty much any other facility needed included as part of the processor chip itself). The main limitation for adding “simple” functionality is generally the storage for the programming - and it can be very limited. I’ve worked with some which supply 2k bytes of program storage and 256 bytes of RAM - and there are smaller ones. (More modern low-end ones may have more resources, but likely not - the goal is lowest cost, so the absolute least resources that do the job.) You really can’t fit that much program into 2048 (or fewer) bytes, and if the additional programming needed to add counting up and displaying the count after the timer expires adds 20 bytes of code, there may not be room.

  19. James Pollock May 17th 2017 at 12:47 am 19

    “The problem with adding that sort of functionality isn’t the processing power of the embedded processor - it is certainly adequate to count up.”

    That’s… not true. The appliance manufacturer is buying timers in finished silicon… they’d have to spec out, and test, a replacement timer system if they want to add features, and then they’d have to have their new custom silicon fabbed instead of using an off-the-shelf part. Whereas the timers they can buy right now are tested and well-understood.

    And all that, for a feature that probably isn’t very important in the buying decision. I imagine the decision tree starts with “is it the right color” and goes through “does it look cool” before we even get down to anything about how effectively it actually cooks food. And, from the manufacturer’s point of view, “can I manufacture it for 5 cents cheaper” is a significant concern… much less adding a couple of bucks for custom silicon.

  20. Ted from Ft. Laud May 17th 2017 at 01:57 am 20

    Ah - no… The appliance manufacturer isn’t buying any separate timers or custom silicon - they are buying (almost certainly <$1 in their quantities) off the shelf microcontrollers with built-in EEPROM, RAM, timers, counters, I/O’s for handling the keypad, display, magnetron, and turntable, possibly (but probably not for this application) ADC and/or PWM, and whatever other “peripherals” are needed. Aside from the microcontroller, there may well be no digital parts (besides the display) in the entire unit. All the control of the microwave would be done in (very compact) software using the various peripherals embedded in the chip. Adding that “feature” would be a simple matter of programming using the existing facilities - the simplicity constrained by the need to fit the additional code in the part. If they needed more capability to add additional features, they would use a different (but pin compatible) chip from the same manufacturer, which would run exactly the same code, but might have more flash or RAM, or some additional peripherals. But while they might do that for fancy auto-cook features or something else they could charge extra for, they wouldn’t do that for this “count up” feature - as you note, it isn’t important enough to spend the extra few cents to get a more capable part if the required functionality fit without the feature but not with. Even if it fit in the part they would otherwise use, they might not put the feature in if the space was close - they’d probably want to save the space in case they needed to fix a problem (since removing even a minor released feature to free up needed space is much worse than not providing the feature in the first place).

  21. James Pollock May 17th 2017 at 02:47 am 21

    “The appliance manufacturer isn’t buying any separate timers or custom silicon”

    That was my point.

    It’s very likely that they’re buying the electronics as an assembly, built by someone else. Appliance manufacturing and semiconductor/electronics manufacturing are different things with different challenges, and doing both on the same line is just asking for trouble.

  22. Boise Ed May 17th 2017 at 03:36 am 22

    Sorry, James [15], but Apple ][ never used a 386. It had a 6502, which worked much differently than that Intel thing.

  23. James Pollock May 17th 2017 at 05:07 am 23

    “Sorry, James [15], but Apple ][ never used a 386.”
    I don’t know why you’re sorry, Ed, because nobody said Apple II’s had 386’s.
    I said Apple II’s had a chip that was designed as an appliance controller.
    Most of the Apple II line had a 6502, like the Atari and Commodore models of the day… a chip that was designed as an appliance controller. There were a couple of models of Apple II which used other chips… the Apple IIc used a 65C02, and the Apple IIGS used a 65816.

    The story of the 6502, and the company that designed and manufactured it, MOS Technology, is an interesting one.

    You see that little dot after the word “controller”? It’s called a period, and it’s a sign that the first sentence has ended and a new one is about to begin. You seem to have missed the one between the sentence where I talked about Apple II’s and the sentence where I talked about 386’s.

  24. James Pollock May 17th 2017 at 11:34 am 24

    And, just because we (OK, I) was talking about the processing power in cars, this came out today:

  25. Brian in STL May 17th 2017 at 12:54 pm 25

    My kitchen timer/probe thermometer counts up when time has been reached.

  26. Ted from Ft. Laud May 17th 2017 at 03:35 pm 26

    Actually, it’s pretty unlikely that they are buying the electronics as an outside assembly, principally because the assembly would be so trivial (a few parts) - the cost of outsourcing that would almost certainly exceed the benefit (note that in my case, this is for a microwave oven that cost $50 - so by definition you can’t be talking about expensive electronics). And while semi manufacturing is a completely different world (and I never claimed they would make their own semi parts - as I said, they obviously buy those off the shelf), appliances and electronics have been married for a long time - I’ve had electronic control kitchen appliances for 20 years or more, and the complexity of the electro-mechanical and the control and sensor paths is far greater than that of the “electronic assembly”. (They are allowed to do this in 2 separate places in their manufacturing plant.)

    And none of that really matters anyway - whether Daewoo put the chip on the PC board themselves or paid someone else to do it, they are going to end up with the same (highly integrated) microcontroller chip, and with software they write themselves. And that still means that the only difference between the microwave oven that just beeps and stops when it counts down to zero, and one that continues counting up is software (and not much, but perhaps a little too much).

  27. Ted from Ft. Laud May 17th 2017 at 04:27 pm 27

    The 6501/6502 were not designed as appliance controllers. Although it isn’t completely clear what the initial intended market was, it was likely instrumentation and industrial control (same as the 6800), plus microcomputers (Commodore’s own interest). Their quantity price at launch and for a fair while after was (as intended) about $25 each (and this was 1976 dollars) - way too expensive for appliances. Chuck Peddle (the head of the development team) likely dreamed of them ultimately going into appliances and many other things, but instrumentation, industrial devices, and maybe automotive (plus the microcomputers) were (probably) realistic markets at the time they designed the chip.

    (And the 65C02 was effectively a somewhat bug fixed, slightly improved, software compatible (and pin compatible) version of a 6502. The 65816 was a somewhat/mostly upward compatible 16 bit version of the 65C02 (like the 8086 vs the 8080) that had a full 65C02 compatibility mode that the IIGS initially made use of - so at first, it was effectively just a faster 65C02.)

  28. Ted from Ft. Laud May 17th 2017 at 04:28 pm 28

    Oops - missed a close italics…

  29. Bookworm May 17th 2017 at 04:32 pm 29

    Thanks for the discussion! I was a software weenie and part-time programmer, but I didn’t know anything about appliance software. I appreciate everyone’s input.

    Brian #25, what brand is your timer? Maybe that’s what I need.

  30. Christine May 17th 2017 at 08:41 pm 30

    Ted from Ft. Laud - thank you for clarifying about the limitations of a chip. I’ve been out of the field for a while, and while I haven’t precisely forgotten the details, they take a bit longer than it’s worth to remember clearly.

  31. James Pollock May 17th 2017 at 09:26 pm 31

    “microcomputers (Commodore’s own interest).”

    Commodore didn’t own MOS Technology when the 6502 was designed, so it’s pretty unlikely that Commodore’s interests were taken into account.

    “it isn’t completely clear what the initial intended market was”
    The 6502’s intended market was, well, everything. The designers of the 6502 split off from Motorola out of frustration that Motorola wasn’t listening to its customers, and the belief that they could produce a low-cost option. ($25 as the targeted price point sounds pretty good compared to the competition… the 6800 processors targeted around $300, and Intel’s pricing was even higher for the 8080… back then, processors were a side business for Intel; their primary business was RAM.)

    “(And the 65C02 was effectively a somewhat bug fixed, slightly improved, software compatible (and pin compatible) version of a 6502. The 65816 was a somewhat/mostly upward compatible 16 bit version of the 65C02 (like the 8086 vs the 8080) ”
    Not at all. The 8080 family is incompatible with the 8086 family. (The 8080 was the dominant chip in personal computing before IBM entered the business. In fact, Apple II’s were upgradeable to an 8080 processor, and the Commodore 128 had one built-in, both so that these computers could run CP/M and the CP/M application library. The 8080 was successfully cloned by Zilog, and the Z80 chip was used in a lot of applications long after CP/M passed on. A lot of coin-op games used a Z80 as the sound controller.)

  32. Boise Ed May 18th 2017 at 12:03 am 32

    James, I assumed that you intended those ellipses to substitute for dashes, and then it’s a natural assumption that your immediately subsequent mention of the 386 was continuing the reference to your list of computers. If you wish not to be misunderstood, then you should write more clearly.

  33. Dave in Boston May 18th 2017 at 12:44 am 33

    James, you are wrong. Intel’s wrongheaded compatibility notions go way back. The 8080 and 8086 weren’t directly compatible, but you could run unmodified 8080 assembler source through an 8086 assembler and get a working 8086 binary out.

    Also, I think you’ll find the C128 had a Z80 in it, not an 8080, and also that the usual CP/M relied on Z80 features and wouldn’t run on a vintage 8080, but I’m less sure about that.

  34. James Pollock May 18th 2017 at 01:37 am 34

    “If you wish not to be misunderstood, then you should write more clearly.”
    I’m sorry you weren’t previously familiar with the English convention of ending a sentence with a period. Not sure why you think this is an error on my part.

    “Also, I think you’ll find the C128 had a Z80 in it”
    Yeah. A Z80 is a cheaper, second-sourced 8080. I’ve made several references to this.

    “and also that the usual CP/M relied on Z80 features and wouldn’t run on a vintage 8080”

  35. Kilby May 18th 2017 at 04:51 am 35

    Cars used the 80186 chip for a very long time, starting in the 80’s. This was part of the reason that the PCjr failed: the original design was supposed to use the 80186, but Microsoft had (incorrectly) adopted a series of reserved interrupts in the software for DOS and/or Basic. Microsoft refused to modify its software, and Intel was not about to modify the 80186 to use different interrupts, since car manufacturers were dependent on the existing design. So IBM was forced to go back to the 8088 for the PCjr, leaving the system with far less processing power than they had originally planned.

  36. Dave in Boston May 18th 2017 at 11:05 am 36

    Even Wikipedia will tell you that the Z80 was an extension of the 8080, James. But whatever, you’ve repeatedly demonstrated yourself incapable of admitting you were wrong…

  37. James Pollock May 18th 2017 at 11:25 am 37

    “Even Wikipedia will tell you that the Z80 was an extension of the 8080, James.”
    Right. Now… how was it used? As a cheaper replacement for devices that needed to run 8080 code.

    “you’ve repeatedly demonstrated yourself incapable of admitting you were wrong”
    I think you’ve confused your repeatedly being incapable of showing I was wrong for something else.

  38. Ted from Ft. Laud May 18th 2017 at 12:59 pm 38

    It is true that Commodore didn’t own MOS at the time the 6501/2 were designed, but they were MOS’s largest customer at the time, and both companies were in trouble (due to TI entering the calculator market themselves) at the time that the Moto team was hired. I’ve heard both that MOS (unsolicited) wanted to get into microprocessors as a new market after their calculator chips lost market (and Commodore came to them for microprocessors after the fact), and that Jack Tramiel went to them for (cheap) microprocessors so Commodore could get into the microcomputer business to replace the lost calculator business, and that is why MOS hired the Moto team. I haven’t found which story is closest to true, but Commodore was certainly a major factor at MOS well before they bought them, so I’m sure their interests were considered.

    The Z80 was not a second-sourced (or cloned) 8080. While it could run the same code (as a subset of its native ISA), it had a completely different pinout and was electrically incompatible, so boards had to be designed specifically for it. It likely was less expensive (Intel never having been known for its pricing), but it got major design wins more because it was much faster (even without taking advantage of the ISA improvements) and (for a variety of reasons) was easier to design in, especially for embedded uses. The Z80 (and 6502) were dominant in the “personal computer” market before the IBM PC, having displaced the 8080 pretty quickly after they came to market.

    You are correct that CP/M (as shipped by DR) used only the 8080 ISA and not any of the Z80 enhancements. However, Dave is correct that the 8086 was an assembly source superset of the 8080 (not binary compatible, but that is unimportant for embedded uses - you want to preserve your developed code; you have no interest in reusing your binaries). Similarly, the 65816 was a source level superset of the 65C02 (as well as having a compatibility mode that allowed it to run as a 6502 at the binary level).

    As far as ever getting you to admit that you might be mistaken about anything, well - I think we learned a long time ago that wasn’t going to happen…

  39. James Pollock May 18th 2017 at 02:33 pm 39

    ” The Z80 (and 6502) were dominant in the “personal computer” market before the IBM PC, having displaced the 8080 pretty quickly after they came to market.”

    Mass-market machines (the Atari 2600, 400, and 800, the Apple I and II, the Commodore PET) used the 6502 because it was cheap. CP/M, the dominant “business” system at the time, was built for the Intel 8080, and there were dozens of manufacturers using it. There’s a REASON the Z80 was designed to run 8080 code natively. Then IBM showed up and put its giant, monster foot down and the 8086 took over. Yes, there were still 6502s (and 6502 descendants, like the 6510) in products that sold quite well. Yes, the Motorola 68000 (and its descendants) showed up and was used in some systems… one of which survives to this day… now using Intel x86 chips.
    In embedded systems, the Z80 continued to find use long after its day as a computer CPU had faded into memory. But in computers, with one substantial exception, the Z80 was used almost exclusively to run CP/M 2.2. Which was designed for the 8080. The Z80 computers that didn’t run CP/M, again with that one exception, were pretty much non-starters in the states (Timex/Sinclair, for example).

    Zilog sold better 8080’s than Intel’s. NEC sold better 8088s and 8086’s than Intel’s. AMD sold better 386’s than Intel’s. What do all of those have in common? Intel was moving on to newer, better, higher-margin chips. The secret of Intel’s success isn’t that they have better engineers (though their engineers are pretty good), nor better marketing budgets (though the Intel marketing department is pretty good, too.) Intel’s advantage is their manufacturing process engineers, which let them build parts at a lower cost than their competitors. Intel could sell… at a profit!… at a price point that was below AMD’s costs. Cyrix couldn’t manufacture at all.

  40. Wendy May 18th 2017 at 05:48 pm 40

    Not to restart an argument, but James, I have to agree with Boise Ed, your statement in 15 was unclear. You wrote:
    “The processor that was used in many early-generation home computers… Atari, Apple II, Commodore 64… was originally developed as an appliance controller. About 20 years’ worth of cars used a 386 controller for the emissions-control system.” and then started another paragraph.
    The way this is written, it is easy to assume that the two sentences are referring to the same processor chip. (Yes, we know it is not true, but that’s not clear from the way you wrote it, and younger people might have no clue about these facts.) You needed to mention a different manufacturer in the second sentence to make it clear that you were not saying that the 386 was the chip that was in the 3 computers mentioned in sentence one, or use the name of the processor that was in them in sentence one.

    And, while I admire your range of knowledge, it is a bit irritating that you nitpick others’ comments, but take offense when someone criticizes or misunderstands one of yours.

  41. Meryl A May 23rd 2017 at 02:33 am 41

    And we have the assortment in the basement and the Teddy bear’s room - Atari 800, Commodore 128, Epson 286, some 386, Pentium… (Yes, we jumped over the 486)

    chakolate - My mom used a desktop computer at work, but in her late 50’s (she is now 88) she started getting hand-me-down computers - my BIL gave her one (he works in IT). I gave her my old one - which was my husband’s old one. (Whenever husband or I needed a new computer, he got the new one and I got the old one, as he uses graphics and needs the faster computer.) Getting the computer ready for her was the first time I was using multiple computers at once. I had her computer running while I had mine running to figure out which of the freeware programs to put on hers and get them out of my downloads folder. Robert walked in, looked, and shook his head. These days he does that if I am running both laptops and the desktop at once.

    Now, if only my Blackberry had not just stopped working with the Internet - telephone calls and text messages still work, Internet by wifi still works, but… Spent an hour and a half on chat with the provider; now waiting to hear from Blackberry. Big decision if it cannot be fixed - cut the service down to almost no data (since I can’t use it) and save $15 a month off the $45 price, or get a new phone. I would go with cutting the data, but I’m sure Robert will tell me I have to have data.

  42. Brian in STL May 23rd 2017 at 01:58 pm 42

    “Brian #25, what brand is your timer? Maybe that’s what I need.”

    Coming back late to this, but mine is a Taylor. It looks a lot like this one:
