Scribbles on the New Mac Pro

Posted by mitch on January 26, 2014
hardware

A significant number of folks have asked about my thoughts on the new Mac Pro… so here we go. I promise not to tell you the same nonsense you have already read everywhere else (lighted ports, etc.).

Some background: I bought an 8-core 2008 Mac Pro the day it was available for pre-order. It was my main workstation for years, until September 2012, when the speed and RAM ceiling became painful enough to upgrade to the “2012” Mac Pro, a 12-core 2.4 GHz machine. Clock for clock, that upgrade yielded compute performance roughly double that of the 2008 Mac Pro.

I wasn’t sure what to expect with that upgrade, nor was I sure what to expect with the new 2013 Mac Pro. Because of price, I elected to try a 6-core machine with the D500 video, 1 TB flash, and 64 GB of OWC RAM.

I recently ran some performance tests to see how things are going with the types of computing I do. One test is a unit test of some code I am writing. The code talks to several VMs on a third box, a Dell running VMware ESXi, and spends most of its time in select() loops. There was almost no performance difference between the old and new Macs–about 3%–which isn’t surprising, since the test spends its time waiting on the network rather than on the CPU.
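
For the curious, the hot path of that test has roughly the shape sketched below: the process sits in select() waiting on sockets to the VMs, so CPU clock barely matters. This is a minimal illustration, not the actual test code; vm_fds stands in for already-connected sockets to the VMs.

```
// Illustrative only: the rough shape of an I/O-bound test loop. This is not
// the actual test code; vm_fds stands in for already-connected sockets to the
// VMs on the ESXi box.
#include <sys/select.h>
#include <unistd.h>
#include <algorithm>
#include <vector>

void poll_vms(const std::vector<int>& vm_fds) {
    for (;;) {
        fd_set readfds;
        FD_ZERO(&readfds);
        int maxfd = -1;
        for (int fd : vm_fds) {
            FD_SET(fd, &readfds);
            maxfd = std::max(maxfd, fd);
        }

        // Nearly all of the wall-clock time is spent blocked here waiting for
        // a VM to respond, which is why CPU clock speed barely matters.
        timeval timeout = {1, 0};  // wake up at least once a second
        int ready = select(maxfd + 1, &readfds, nullptr, nullptr, &timeout);
        if (ready <= 0) continue;  // timeout (or an error, ignored here)

        for (int fd : vm_fds) {
            if (!FD_ISSET(fd, &readfds)) continue;
            char buf[4096];
            ssize_t n = read(fd, buf, sizeof(buf));
            if (n <= 0) return;    // peer closed the connection; stop
            // ... process the response and send the next request ...
        }
    }
}
```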

However, I have some code that runs on local disk and does heavier CPU work. One of the pieces of code shoves a lot of data through a commercial database package inside of a VM. The VM is configured with 8 cores and 16 GB of RAM on both machines. We’ll call this Test A.

Another test does extensive CPU calculations on a multi-gigabyte dataset. The dataset is read once, computations are done and correlated. This runs on native hardware and not inside of a VM. We’ll call this Test B.

            old Mac Pro [1]    new Mac Pro [2]    Retina 13″ MacBook Pro [3]
Test A:     65.6 seconds       38.1 seconds       N/A (not enough RAM)
Test B:     82.3 seconds       52.9 seconds       67.8 seconds

[1] 2012 Mac Pro, 12-core 2.4 GHz, 64 GB of RAM, OWC PCIe flash
[2] 2013 Mac Pro, 6-core 3.5 GHz, 64 GB of RAM, Apple flash
[3] 2013 Retina MacBook Pro 13″, 2-core 3 GHz i7, 8 GB of RAM, Apple flash

As you can see, the new Mac does the same work in about 40% less time. The CPU work here uses only 1-3 cores; it doesn’t scale up to use all the available cores. To keep the tests as fair as possible, the old Mac Pro boots from a 4-SSD RAID 0+1 and the test data lived on an OWC PCIe flash card. None of these tests use the GPUs of the old or new Macs in any fashion, nor is the code particularly optimized one way or the other. I ran the tests 3 times per machine and flushed the buffer caches before each run.
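
The procedure was essentially the harness sketched below: flush the filesystem cache, run the workload, record wall-clock time, repeat three times. It is a sketch only; run_workload() is a placeholder for the real Test A or Test B, and it assumes macOS’s purge command is available (it may need sudo on some systems).

```
// Rough shape of the timing harness (a sketch, not the real code).
// run_workload() is a placeholder for Test A or Test B, and the cache flush
// assumes macOS's `purge` command is available (it may need sudo).
#include <chrono>
#include <cstdio>
#include <cstdlib>

static void run_workload() {
    // ... the actual database load or dataset computation goes here ...
}

int main() {
    const int kRuns = 3;
    for (int i = 0; i < kRuns; ++i) {
        std::system("purge");  // drop cached file data so each run starts cold

        const auto start = std::chrono::steady_clock::now();
        run_workload();
        const auto stop = std::chrono::steady_clock::now();

        const double secs = std::chrono::duration<double>(stop - start).count();
        std::printf("run %d: %.1f seconds\n", i + 1, secs);
    }
    return 0;
}
```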

Does the Mac feel faster day to day? Maybe. In applications like Aperture, where I have 30,000 photos, scrolling and manipulation “seem” a heck of a lot better. (For reference, the old Mac has the Sapphire 3 GB 7950 Mac card. I don’t have an original Radeon 5770 to test with, having sold it.)

The cable mess behind the new Mac is the same as behind the old Mac. In fact, it’s really Apple’s active DVI adapters for my old Apple monitors that account for most of the cable mess. Once the Apple monitors start to die, that mess will go away, but until then I see little reason to replace them.

The physical space of the new Mac Pro is a significant advantage. The old Pro uses 4 sq ft of floor space with its external disk array. The new Pro by itself actually has a footprint smaller than a Mac Mini (see photo at end of this post)!

The fan is quiet, even under heavy CPU load. The top surface seems to range from 110 F — 130 F; the old Mac has a surface exhaust range from 95 — 99 F at the time I measured it. So it’s hotter to the touch, and indeed the sides of the chassis range from 91 F at the very bottom to about 96 F on average. For reference, the top of my closed Retina MacBook at the time I’m writing this is about 90 F and the metal surface of the 30″ Cinema display runs around 88 F to 90 F in my measurements (all measured with an IR non-contact thermometer).

Because there is no “front” of the new Mac Pro, you can turn it at any angle that reduces cable mess without feeling like you’ve got it out of alignment with, say, the edge of a desk. This turns out to be useful if you’re a bit particular about such things.

On storage expansion, there’s been a lot of concern about the inability to put drives inside the new Pro. Frankly, I ran my 2008 machine without any internal disks for years, instead using an Areca 1680x SAS RAID, so I’m glad to see this change. There are lots of consumer-level RAIDs out there under $1000, but I’ve given up on using them–performance is poor and integrity is often questionable.

I am backing up to a pair of 18 TB Thunderbolt Pegasus systems connected to a Mini in my basement, and I bought an Areca ARC-8050 Thunderbolt RAID 8-Bay enclosure with 24 TB of disks for the new Pro. Sadly, while it’s fine in a closet or basement, it turns out to be too loud to sit on a desk, so I bit the bullet and ordered a 10 meter Thunderbolt cable. The cable hasn’t arrived yet, so I haven’t moved my data off the Areca SAS RAID in my old Pro. Once that is done, I expect to retire the old 8 TB SAS RAID and just use the new one. These are expensive storage options, but the cheap stuff is even more expensive when it fails.

So, should you buy the new Mac Pro?

I don’t know.

For me, buying this Pro was never about upgrading from my old Pro, but rather about upgrading my second workstation–a maxed-out 2012 Mac Mini that struggled to drive 30″ displays and crashed regularly while doing so (it’s stable with smaller displays, but across the four or five Minis I’ve had over the years, none of them could reliably drive a 30″–Apple should really not pretend that they can). In the tests above, I’ve ignored the 900 MHz clock difference, but clearly it contributes to the new machine’s results on these kinds of tests.

What about price? This new Mac Pro ran me about $6100 with tax, shipping, and the OWC RAM upgrade. The old Mac Pro cost about $6300 for the system, PCIe flash, SSDs, brackets, video card upgrade, and OWC RAM upgrade. (The disk systems are essential to either Mac as a main workstation, but also about the same price as each other.) I don’t view the new Mac Pro as materially different in price. Pretty much every main workstation I’ve had in the last 12 years has run into the low five figures. In the grand scheme of things, it’s still cheaper than, say, premium kitchen appliances, though perhaps it doesn’t last as long! On the other hand, I’m not good enough at cooking that my kitchen appliances are tools that enable income. If I weren’t using my Macs to make money, I doubt I’d be buying such costly machines.

While I am not a video editor, and just do some 3D modeling for fun as part of furniture design or remodeling projects, I feel this machine is justified by my heavy CPU work and my desire for a lot of monitors. I’m not in the target GPU-compute market (yet?), but I do want a big workspace. There’s no other Mac that offers this (I get headaches from the glossy displays Apple offers, though the smaller laptop screens are OK).

So now on my desk, I have a pair of Pros, each driving a set of three 30″ displays, which matches the work I am doing right now. I haven’t had a video lockup in 12 days and counting, which has proven a huge time saver and frustration reducer, so I’m happy that I jumped on this sooner rather than later.


30 Years of Mac

Posted by mitch on January 24, 2014
hardware

My parents bought a Mac 128K in 1984 (pictured below). The screen stopped working in 1993, and it hadn’t been reliable for a number of years before that–my dad upgraded to a pair of Mac Pluses when they came out, and later he upgraded again to the Mac II.

There were lots of frustrating things about the Mac 128. Almost no software worked on it, since it was outdated almost immediately by the Mac 512. MacWrite didn’t have a spell check or much of anything else. Only one program could run at a time–no MultiFinder. A 1 MB Mac Plus was a significantly better computer, especially if you had an external hard disk that conveniently fit under the Mac–thus increasing speed, storage capacity, and the height of the monitor. Even the headphone port on the 128 was mono, if I recall correctly.

Yet there was something deeply magical about computing in that era. I spent hours goofing off in MacDraw and MS Basic. At one point, my dad had the system “maxed out” with an Apple 300 baud modem, an external floppy drive, and the ImageWriter I printer. Later, the modem went away and we were modemless for a number of years, but one day he brought home an extra 1200 baud modem from his office, and I spent hours sorting out the Hayes AT command set to get it to work–a lot of registers had to be set on that modem; it wasn’t just a simple matter of ATDT555-1212.

That reminds me, I need to call Comcast. It seems that they cut their pricing on 100 Mbit connections.


Moving AV Gear to the Basement

Posted by mitch on January 04, 2014
audio, home

When I bought my house in Boston, I gutted most of it and did extensive rewiring, including speaker wires in the living room. Recently, I had a large built-in cabinet/bookcase built for the living room and had to move some of those wires and outlets in preparation for it. Since the electricians had to come out anyway, I decided to move all my AV components into the basement. The goal was just to have the TV, speakers, and subwoofer in the living room.

There are now 5 drops down to the basement for the surround speakers. I soldered RCA keystone jacks onto one of the old speaker drops for the subwoofer–the only place I could find solderable keystone RCA jacks was, strangely enough, Radio Shack (for 57 cents each). Behind the TV, I had the electricians pull 8 new Cat6 drops and a single HDMI cable. I also had the electricians run two 15 amp dead runs that go into a 2-gang box and terminate in AC inlets (male connectors) so that the TV and sub in the living room are plugged into the same surge protection system as the basement, thus avoiding any ground loop issues, and also eliminating the need for surge protectors in the living room for this gear.

Four of the Cat6 drops terminate at the AV shelving. I planned to use 2 of these for serial and IR lines, with 2 held as spares in case of future video-over-Cat6 or other needs. The other four Cat6 lines run to the basement patch panel. Of course, some of these could also be patched back to the AV shelves if needed for uses other than Ethernet.

I’m using a cheap IR repeater from Amazon to control the components from my Harmony remote. This works fine with my Onkyo receiver, HDMI switch, Apple TV, and Roku. It doesn’t work with my Oppo bluray player–apparently there’s something different about the IR pulse Oppo uses, and I couldn’t figure out which general repeaters would work from various forum posts. Fortunately, Oppo sells their own IR repeater system for about $25, and I’ve modified it to run over Cat6 as well. This means I have two IR sensors hidden under the TV that plug into 1/8″ mono jacks in the wall using Leviton keystone modules.

The Playstation 4 and Wii use Bluetooth controllers, which work fine through the floor. Nothing fancy was needed to extend these. It turns out that the Wii sensor bar is an “IR flashlight”–the bar itself doesn’t send any data to the Wii. So I bought one with a USB connector on it so it can plug into any USB power supply. (The original Wii bar had weird 3-tooth screws and I didn’t want to tear it up.)

I also finally got around to building a 12 V trigger solution for my amplifier–my 7-year-old Onkyo receiver doesn’t have a 12 V trigger for the main zone, but a 10 V wall wart plugged into the Onkyo does the trick, now that I’ve soldered a 1/8″ mono plug onto the end and plugged it into the Outlaw amp. (My front speakers are 4 ohm and the Onkyo would probably overheat trying to drive them.)

The final missing piece was a volume display. I missed knowing the receiver’s volume, the selected input, and the listening mode, so I built a simple serial device that plugs into the Onkyo’s serial port over Cat6 cables. I have a 20×2 large-character display that queries the Onkyo for status a few times a second (powered by an Arduino–firmware code is here). Muting, power state, volume, and listening mode (e.g., THX, Stereo, Pure Audio…) are displayed, as well as the input source. My next step is to add a second serial interface to the display so that I can query the Oppo and show time into the disc, playing state, etc. (Many newer receivers support their serial protocols over Ethernet, albeit at a higher standby power usage, and as far as I can tell, Oppo has not opened up their Ethernet protocol, though their serial protocol is well documented.) The enclosure is rather ugly, but it works for the moment until I build something better:
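
For a rough idea of what such firmware boils down to, here is a minimal Arduino-style sketch. It is not the code linked above; the serial command format, baud rate, wiring, and display type called out in the comments are assumptions, so treat it as a starting point rather than a drop-in.

```
// Minimal sketch of the status-display idea; it is NOT the firmware linked
// above. Assumptions to adjust for your hardware: the receiver's RS-232 port
// runs through a level shifter (e.g. MAX232) to a SoftwareSerial port at
// 9600 8N1, the receiver accepts ISCP-style queries ("!1MVLQSTN" + CR for
// master volume, "!1SLIQSTN" for input), and the display is an
// HD44780-compatible 20x2 LCD. Pin numbers are examples only.
#include <LiquidCrystal.h>
#include <SoftwareSerial.h>

LiquidCrystal lcd(12, 11, 5, 4, 3, 2);  // RS, E, D4-D7 (example wiring)
SoftwareSerial onkyo(8, 9);             // RX, TX to the receiver
unsigned long lastPoll = 0;
String reply;

void setup() {
  lcd.begin(20, 2);
  onkyo.begin(9600);
  lcd.print("Waiting for Onkyo");
}

void loop() {
  // Poll the receiver a few times a second.
  if (millis() - lastPoll > 300) {
    onkyo.print("!1MVLQSTN\r");   // query master volume
    onkyo.print("!1SLIQSTN\r");   // query input selector
    lastPoll = millis();
  }

  // Accumulate CR-terminated replies and push them to the display.
  while (onkyo.available()) {
    char c = onkyo.read();
    if (c != '\r' && c != '\n') {
      reply += c;
      continue;
    }
    if (reply.startsWith("!1MVL")) {         // e.g. "!1MVL2C" (volume in hex)
      lcd.setCursor(0, 0);
      lcd.print("Volume: ");
      lcd.print(strtol(reply.substring(5).c_str(), NULL, 16));
      lcd.print("   ");
    } else if (reply.startsWith("!1SLI")) {  // input selector code
      lcd.setCursor(0, 1);
      lcd.print("Input:  ");
      lcd.print(reply.substring(5));
      lcd.print("   ");
    }
    reply = "";
  }
}
```

The same poll-and-parse loop extends naturally to a second serial interface for the Oppo.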

Note that another option is just to buy a receiver/pre-amp that puts the volume out over HDMI. My receiver is older and leaves the HDMI signal unmolested. Most modern gear will just put the volume up on the screen, but my next processor is going to be a big purchase, and this was a lot cheaper for now.

I did make a few mistakes:

  • The quad coming off the inlets should have been a 4-gang (8 outlets).
  • I almost ended up with only 4 Cat6 drops behind the entertainment center, mostly because of the length of Cat6 cable I had on hand. Happily, my electrician went and bought another 1000 ft spool and said, “Mitch, what do you really want?”
  • I probably should have run a second HDMI cable, just in case I ever need it.
  • The 8 Cat6 cables, a coax line (in case I ever want another sub or need a coax line), and the HDMI cable all go into a 3-gang box in the living room. This is a bit tight for this many wires, especially when one of the Cat6 lines splits into two 1/8″ connectors.
  • Not really a mistake, but if you’re doing this and buying new shelving for the rack, buy shelves with wheels. I am just using an old shelf I already had, but wheels would be very handy.

If you have a small living room with a basement or closet nearby, this might be a good way to go if you don’t want to get rid of AV components. With more room to keep things organized and more airflow around the electronics, I’m really happy with how this turned out. Since the bluray player is in the basement, the DVDs and blurays are now in the basement too, and this has freed up ~50 linear feet of shelving upstairs. (I’ve ripped a lot of my movies, but it’s a pain and I haven’t done them all.)

And best of all, there is now a lot less crap in the living room.


Why I Hate Computers

Posted by mitch on October 16, 2013
productivity

Sometimes the string never ends.

I was working on some code today, debugging a new set of functions I wrote this morning.

Off and on I’ve had issues with my development VM reaching one of the nodes, another VM, in my test environment. As I went to clear the state on that VM, the network stopped working.

I figured this was perhaps a bug in VMware Fusion 4, so I decided to upgrade to Fusion 6 Pro. I went to the VMware online store to buy it and got an error when trying to put the product into my shopping cart:

Error Number:  SIT_000002

And the same error when I tried again.

I logged into my account, which remembered that I had put Fusion 6 Pro into my shopping cart before. So I went to check out and got an error that the cart was empty.

So I tried adding it again and it worked.

Then I got an error when I put in my credit card number:

PMT_000011 : vmware, en_US, There is money to authorize, But no Aurthorize delegated were applicable

Then I found a free trial of Fusion 6 Pro and downloaded that and installed it on a test Mac Mini.

I then started trying to copy the test VM to the Mac Mini and observed an 11.5 MB/s transfer rate, which is suspiciously close to the practical maximum of 100baseT (100 Mbit/s works out to 12.5 MB/s before protocol overhead). But I have GigE. What’s going on? I checked previous network traffic stats on both machines–they had both sustained 70-90 MB/s transfers in the last day.
Wondering if it was an AFP issue, I tried SMB and noticed the network throughput stayed at 11.5ish. Multiple streams didn’t help.

I finally found that the negotiated speed was indeed 100 Mbps on the Mac Pro for some reason. Forcing it to GigE caused the interface to achieve and lose carrier rapidly after a few minutes of working.

I tried to login to my switch and couldn’t remember the password, but I did eventually.

Then I wondered which port the Mac Pro was on.

After many minutes, I tracked the problem to a specific cable, not a switch port, wall port, or a port on the Mac Pro. I’m not sure why; the cable had been working fine for years.

In the course of all this, I discovered I have very few spare Cat6 cables.

I logged into Monoprice to order more cables and almost got charged $58 for international shipping–I might not have noticed, except at checkout they said they would only accept PayPal for international orders. Apparently, Monoprice had decided I lived in the UK since my last order.

Much teeth-gnashing ensued to get my country fixed in their store.

Order placed.

Started to write this blog post; the battery in the laptop I sat down with was inexplicably dead, so I had to get a charger.

And don’t even get me started on Time Machine issues today.

I still don’t know if the network will work in that VM or not. I am confident my code doesn’t work yet.

I don’t know how anyone uses a computer. They are way too complicated.


The New Mac Pro

Posted by mitch on June 11, 2013
hardware

I am very excited about the new Mac Pro.

We don’t know the price yet. We don’t have full specifications. It’s not clear this form factor will ever support dual CPU packages or 8 DIMM slots (it seems it might only have 4 sockets). The total price for four 32 GB DIMMs currently runs about $10,000 from B&H. Happily, four 16 GB DIMMs are a lot less—around $1,200. 64 GB of RAM is sufficient for me for now, but I am hoping to see a 128 GB option for around $1,200 within two years of owning the machine, based on how my need for memory has grown in the past.

Apple claims flash I/O throughput of around 1250 MB/s, which is better than the four-disk SATA SSD RAID 1+0 in my Mac Pro and faster than my first-generation OWC Accelsior PCIe card.

Apple mentions up to 2×6 GB of dedicated video RAM, which significantly beats the 1-3 GB cards we’ve had on the market until now. I am also excited at the prospect of 30″ displays at 3840 x 2160. My three Apple 30″ displays are starting to show their age in terms of backlight wear—it takes longer and longer for them to come to full brightness. I bought a Dell 30″ for my other desk, and I had to buy a calibrator to get acceptable color out of it. So I am hopeful Apple will ship a matte 30″ 4K display… (this seems rather unlikely).

Only four USB ports is a shame, but not the end of the world. Hopefully the USB 3 hub issues with Macs will be resolved soon.

And then there are the PCI slots. My Mac Pro currently has a 7950 video card in one slot, an Areca 1680x, an eSATA card that I quit using, and the PCIe Accelsior. Frankly, the new Mac Pro meets my PCI expansion needs—external chassis are cheap if I ever really need slots (just $980 for a 3-slot Magma; and Apple mentions expansion chassis are supported). What makes this possible is that Thunderbolt RAIDs are just as fast as Areca SAS configurations and generally require a lot less monkeying around. I have two Promise 18 TB Thunderbolt RAIDs connected to a Mac Mini in my basement for Time Machine backups and they have been fantastic.

So I imagine my 2013 Mac Pro will look like the following configuration:

  • Mac Pro with 8 or 12 cores, depending on price and clock options
  • 64 GB of RAM
  • 512 GB — 1 TB flash storage for boot
  • Thunderbolt ports 1-3 — with DisplayPort adapters for existing displays
  • Thunderbolt port 4 — 12-24 TB Thunderbolt RAID for home directory. I’d love to see a 12×2.5″ SSD RAID 1+0 when 1 TB SSDs get under the $400 price point.
  • 3 USB ports connected to hubs
  • 1 USB port connected to an external hard disk to clone the boot drive to
  • Hopefully the audio out line has an optical connection like the AirPort Express and other optical products.

I think this will fit my needs pretty well, as long as a 128 GB RAM upgrade is cheap enough down the line. 256 GB would have been a lot nicer.

And best of all, this configuration will free up at least 4 sq ft of floor space where my Mac Pro and SAS chassis sit. If the computer is quiet enough to sit on the desk, then the Mac Pro and the Thunderbolt RAID together will only take up about 1.5 sq ft of room, which would be a tremendous improvement in my office, where space is at a premium.

Update: I take issue with the complainers who say that the new Mac Pro will lead to a big cable mess. For me, I expect it will be about the same, but take up less floor space:


Sennheiser RS220 vs Sennheiser RS180

Posted by mitch on May 13, 2013
audio


Almost two years ago, I wrote a post comparing the Sennheiser RS180 headphones to my really old Sony IR wireless headphones. It was an easy post to write; the RS 180s were the best thing happening for wireless headphones at the time, as far as I know.

In March 2012, Sennheiser released their first wireless headphones positioned in their audiophile line-up. Until that time, the audiophile models were the HD 518, 558, 598, 600, 650, and 800 (and now, the new 700 model comes in between the 650 and 800 at $1,000). The new RS 220 model has been out for a year and has received positive reviews from media publications, but horrible reviews on Amazon and in forum discussions due to serious signal drop-out issues. I didn’t buy them for a while, fearing those issues were real.

But in a moment of frustration with the RS 180s, I took the plunge (and Amazon has a good return policy). The drop-outs were in fact real and serious. Thanks to a post over on Head-fi.org, I learned one fellow had changed his WiFi network to use channel 11, which I did as well–and that has mostly solved the drop-outs for me.

So if you can solve the signal drops, how are these headphones?

They are fantastic–these are the best wireless headphones on the market. Do they have the same sound quality as my HD 800 rig? No, but at $600, they are a quarter of the price of my HD 800 set-up, weigh less, and have no wire to the headphones. The main frustration I had with the RS 180s ($280–$320 street price) is that piano and classical music are quite muddy on them. The 180s seem better suited to watching TV and listening to modern pop music than to anything with fine detail–and for what they are good at, they are great. But the RS 220s are much better, with the drawbacks of shorter range, less battery life, and the darned wireless signal issues.

For me, the trade-off is worth it as long as the wireless issues remain infrequent. There’s a lot going on in the 2.4 GHz range–WiFi, Bluetooth, cordless mice, microwave ovens–so I remain a bit apprehensive about it. After listening to the 220s, I can say that the 180s experience signal drops as well–they are just more subtle and less irritating. An RS 180 signal drop is like a record skipping, whereas an RS 220 drop feels like an empty second or two on a cell phone call.

Physically, the headphones are much more comfortable than the 180s. The padding is thicker, and the headband isn’t as “crushing.” Beware that the headphones are open, meaning they are not for private listening. The other perk of the 220s is that the base has audio output, which let me get rid of a switchbox for choosing between the headphones and the M-Audio BX5 D2 speakers on my desk. I use a Belkin remote-controlled power strip to turn the M-Audios on and off, so this has simplified my desk a little bit. I also like that the RS 220 base is easy to turn on and off with one hand–the 180 base is very lightweight and its buttons require a firm push.

I am using my 220s with an AudioEngine D1 DAC ($180). It probably doesn’t do the 220s justice, but it’s small and has a volume control on it, which is nice. I don’t feel I have enough room on my desk for something much larger. I have 2 ft AudioQuest cables connecting the DAC to the RS 220 base, which seems fine. The 220 base also has optical input, but I like having the volume control on the AudioEngine unit, so I intend to keep using it rather than connect the computer’s optical out directly to the 220 base.


My Journey to Sanebox

Posted by mitch on March 31, 2013
productivity

(If you just want to read why Sanebox rocks, scroll down.)

Remember Xobni? Originally they were a plug-in for Outlook that did a few things–when a message was viewed in the normal Outlook interface, the Xobni plug-in would show other emails you had exchanged with the sender and the files that had been shared with that person via attachments. It also showed social information about the sender–profile pictures from LinkedIn, number of connections, and so on. And finally, it enabled better email search than Outlook had.

Adam Smith, the founding CEO, gave a talk at MIT and for years, one of the things he said has been stuck in my head: “If you can improve email just a little bit, then you can create a lot of value.”

Mr. Smith was absolutely right. Email is a poor fit for how it is used in most workflows today. In 2010, Mr. Smith left day-to-day activities at Xobni, and I’m afraid the company lost its way, becoming a contact tool rather than an email tool somewhere along the way. That didn’t work out well for Plaxo, and I am not convinced it will work for Xobni. For a few years now, there hasn’t been any innovation in email that was interesting to me.

But in the last few months a few things came to light:

Mailbox, recently acquired by Dropbox. Mailbox built an incredible amount of hype, had a great video that looked interesting, got accused of some pump and dump action with TechCrunch, and ran up to over a million users quite rapidly. I lost interest in the app within minutes of using it. It was clear it was not going to scale to the amount of email I had in my inbox. One feature I did particularly like about Mailbox was the concept of putting an alarm on an email (e.g., a reminder to re-triage this message in a day).

Then there is Mailstrom. I am not really sure what Mailstrom does, but it sends me weekly reports of what’s going on in my inbox. There’s a tool you can use if you remember to go to their site and want to tediously triage stuff. I don’t want to do that. The web site talks about getting to Inbox zero, but I will never pull it off with what they offer. The report is kind of cool though:

[Mailstrom report screenshot]

Finally, there’s Sanebox. Sanebox analyzes the to/from patterns in your inbox and sent mail and automatically moves an email to a folder called @SaneLater if it doesn’t believe you will reply to it. So all bulk email ends up in @SaneLater, which has made dealing with email a ton easier. Sanebox also puts emails older than about 8 or 9 months into a folder called @SaneArchive. I went from 30,000 emails in my personal Inbox to just 1,700 in my Inbox and 1,800 in @SaneLater. It is now much easier to see which emails require replies.
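
To make that concrete, a crude version of such a filter can be written as a sender-history rule. This is only a guess at the general shape of the heuristic, not Sanebox’s actual algorithm; the thresholds, folder routing, and example addresses are invented for illustration.

```
// Toy "will I reply to this?" filter (not Sanebox's actual algorithm; the
// thresholds are invented). Idea: mail from people you write to stays in the
// Inbox; mail from one-way senders gets filed to @SaneLater for later triage.
#include <cstdio>
#include <string>
#include <unordered_map>

struct SenderHistory {
    int received;  // messages received from this address
    int sent_to;   // messages you have sent to this address
};

std::string classify(const std::string& from,
                     const std::unordered_map<std::string, SenderHistory>& history) {
    auto it = history.find(from);
    if (it == history.end()) return "@SaneLater";       // never corresponded
    if (it->second.sent_to >= 2) return "Inbox";        // a real conversation
    if (it->second.received > 20 && it->second.sent_to == 0)
        return "@SaneLater";                            // one-way firehose
    return "Inbox";
}

int main() {
    std::unordered_map<std::string, SenderHistory> history = {
        {"friend@example.com",     {120, 80}},  // two-way correspondence
        {"deals@bigstore.example", {300, 0}},   // bulk mail, never answered
    };
    std::printf("%s\n", classify("friend@example.com", history).c_str());      // Inbox
    std::printf("%s\n", classify("deals@bigstore.example", history).c_str());  // @SaneLater
    std::printf("%s\n", classify("unknown@example.org", history).c_str());     // @SaneLater
    return 0;
}
```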

Sanebox offers a free trial that runs about 2 weeks. Towards the end, they convert customers with a genius piece of messaging (paraphrased): “Hey, if you don’t pay us, we’ll move all your email back the way it was!” Brilliant–I bought it.

Email still sucks. But Sanebox has made it suck a bit less. And the best part is that there is no additional user interface to use.


Shoddy Engineering Practices

Posted by mitch on March 26, 2013
software

Do you find yourself saying things that are so absurd you wouldn’t believe them if you hadn’t lived through them? I’ve had a few of those experiences.

I once worked at a software company that had a strong unit test culture. There were specifications for everything and tests for everything. Sounds great, right? Unfortunately, despite extensive review meetings, the specifications never survived the first few days of coding. The unit tests were detailed and thorough–fantastic! But no one ever ran the whole product in-house. QA didn’t. Developers didn’t. The company didn’t use its own product because “it is too complicated.” After I left the company, I had a problem that the company’s product was designed to solve. As a shareholder, I was interested in the company having customers, and happy to be one. I downloaded the product and installed it. When I fired it up, the product crashed about 3 lines into main(). No one had ever used the product. It’s great to prove that the pieces work individually, but your customers will be using them when they are all bolted together. You have to make sure the holes line up.

I also once worked at a company with the exact opposite approach. There were no unit tests of any kind. No libraries, no modules, nothing. The only way to test anything was to run the product. This meant there were very few crashes on the third line of main(), but it also made it impossible to know whether any of the pieces of the code worked independently. If you can determine whether pieces work independently, you can track those results, tie them to check-ins, and suddenly understand a heck of a lot about which changes in your source control impacted which areas of the product. You can track down not only data corruption but also which modules hit performance issues, whether in a unit-test context or in an integration context in a customer environment.
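
As a trivial sketch of the kind of tracking I mean: when each piece has its own tests, the harness only has to log a pass/fail per module against a revision ID, and regressions start to map to modules and check-ins. The module names and checks below are hypothetical placeholders.

```
// Tiny per-module test harness sketch (hypothetical modules and checks).
// Each piece is exercised on its own, and every result is printed with a
// revision ID so it could be logged and queried per module, per check-in.
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

struct ModuleTest {
    std::string module;
    std::function<bool()> run;
};

int main(int argc, char** argv) {
    const char* revision = (argc > 1) ? argv[1] : "unknown-rev";

    const std::vector<ModuleTest> tests = {
        {"parser",  [] { return std::stoi("42") == 42; }},
        {"storage", [] { std::vector<int> v{1, 2, 3}; return v.size() == 3u; }},
    };

    int failures = 0;
    for (const auto& t : tests) {
        const bool ok = t.run();
        // In a real setup this line would go to a results store keyed by revision.
        std::printf("%s  %-8s  %s\n", revision, t.module.c_str(), ok ? "PASS" : "FAIL");
        if (!ok) ++failures;
    }
    return failures;
}
```

Feed it the current revision from source control and the output is already something you can correlate with check-ins later.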

The first company had a hard time getting customers to be successful with the product. The second company had difficulty predicting when a release would be ready. The former is easier to fix–start using the product internally and also fix the QA organization to include system tests. The latter likely entails an expensive refactor and reflects deeper cultural issues of a hacking organization rather than an engineering organization. Neither extreme is healthy.


Getting Back Into It

Posted by mitch on March 10, 2013
home

Over the last 25 weeks, I’ve been trying hard to do very little beyond sleep and goof off. Even though I don’t have a job and have been considering myself on vacation, looking back at my calendar, I still managed to do over 100 business-related meetings and spend 4 weeks on the road. Nevertheless, I’ve been relaxing–I bought a Playstation Vita last March and didn’t start playing it until January.

I’m starting to get going on a few projects, and as I fiddle around with possible next directions, I’ve come to realize that I don’t have enough desk space to juggle completely different activities when they hit creative road blocks. I was at a standstill most of last week because my desk was covered with a project that wasn’t going anywhere. I needed another desk to dump it on so I could still look at it but engage with another project while it percolates. Since I’m doing some other remodeling this year, I started looking at reclaiming my closet (which is just a junk room) to pick up an extra 35 sq ft of “spreading out” surface area.

I’m also finding that I have too little filing space–despite moving most “archive” files into banker boxes and shredding about 60 gallons of files, I still don’t have enough. So this configuration adds a second linear file cabinet.

Some other remodel considerations for the office:

1. Build a large built-in bookcase at the front of the office. In this configuration, it will hold most of my office-related books–barely–which would still be a big improvement over the current situation. Below is a rendering my architect created of the bookcase. I also plan to reclaim space under the eave for the stereo, so it no longer takes up floor space in the room where it currently resides.

2. Remove the chimney. I should have done this 5 years ago but felt like I had feature-creeped on the first remodel too much as it was. To finish a software release, you have to stop adding crap to the release and get things fixed. The same applies to a remodel.
3. Finally install a split AC system. I got a quote for this years ago, but could never get the fellow to come out and do the work.

Sadly, remodeling in real life isn’t as fast as a few hours of monkeying around in SketchUp… so until then, piles of stuff it is!

If you came to this post hoping to read about what I’m working on and you weren’t happy to hear about goofing off, come back in 25 weeks!


Stop Hating on Bose?

Posted by mitch on February 14, 2013
audio

Sennheiser PXC-350 (left) and Bose QC-15 (right).

For years I’ve been fascinated by the hate against Bose products. Bose must be really bad to get all the negative reviews, right? Search any electronics forum or Amazon reviews, and you’ll find thousands of people frothing about how much Bose sucks.

In December 2009, I wanted a cheap pair of computer speakers for my office in California. I didn’t need anything fancy and I didn’t want a subwoofer. I went to Fry’s and the only 2-speaker system they had for a reasonable price was the Bose Companion 2 speakers for $100. Sighing, I bought them.

They weren’t super awesome. In fact, they were pretty muddy. I gave them a negative review on Amazon. However, they were $100 and small. At this point they sit in the closet; I have a pair of M-Audio BX5 D2s on my desk, which take up significantly more room and sacrifice a lot of usability. They are plugged into a cheap AudioEngine DAC/amp, which means the whole system cost four times the Bose Companion 2s. (Update: After I posted this, I remembered that when I moved the Bose Companion 2s to my office in Boston, they sounded a lot better–the acoustics in my California office were crap, I suppose.)

Fast-forward to the middle of 2011: I decided to get rid of my stereo separates in the bedroom. My cleaners were always moving and disconnecting the speakers, and the whole system took up a lot of room. With some reluctance, I bought a Bose Wave radio/CD player–I couldn’t find anything whose looks I liked better than the Bose at any price point. There are competing products for less money, but they look like crap. I wanted something that looked good.

It’s an expensive box–$500 for a radio, CD player, amp, and speakers. If you listen to the Wave within 2 feet of the unit, it is indeed “bass-y” and “boom-y.” But if you listen to it across the room, it sounds great! I really love my Bose Wave system.

Back in 2009, since I “knew” that Bose sucked, I bought the Sennheiser PXC-350 headphones for air travel. The current model is the Sennheiser PXC-450, which runs $350 on Amazon as of this writing. Recently I misplaced the Sennheiser headphones and bought the Bose QuietComfort 15 headphones from the local Apple Store.

I could not believe how good the Bose QuietComforts are. I suspect they have a bit of a low-pass filter in them–they are not as accurate as, say, a pair of Sennheiser HD 800s. But I don’t care about accuracy for noise-canceling headphones! I don’t want to hear engine noise, fan noise, or people talking when I am wearing these. Without sound playing, the Bose headphones are dead silent in my office, which has a bit of desk fan noise. The Sennheiser PXC-350s pass a bit of that noise through and introduce some hiss, which is often an artifact of cheaper noise-canceling headphones.

The cord on the Sennheiser ‘phones is much nicer and has a volume control. The Bose came with two cords, one without controls and one with an Apple remote. The Apple remote works fine with my iPhone 4S, but with my current-generation iPod Nano, it introduces feedback noise that is unacceptable. That’s a serious issue, either with my iPod or the headphones.

However, armed with better silence, smaller size, and lighter weight, the Bose headphones are a clear winner.

So if you’ve been avoiding Bose because you’ve heard they suck, maybe take another look. If you’re looking for accurate listening, you’ll note that I said above that none of these Bose products produce accurate sound to my ear. Personally, I don’t need accurate listening in my bedroom, on the train, on a plane, or to hear that Skype is ringing.

Some photos:

The Bose headphones are quite a bit smaller.

Comparing the cup size. The Bose headphones are tighter on the ear, but not to the point it is uncomfortable.

7.0 oz vs 10.0 oz

Despite being a Sennheiser fan, I can say that the Bose QC 15s are a much better buy for typical noise-canceling applications.
