Running legacy Win-only software on Mac OS

One of the headaches of moving to Mac has been that I have one piece of statistical software that is Windows only. I could abandon the software and learn R, but I didn’t want to do anything rash. :) Besides, at the Midwest, Fred assured me that he was able to get everything to run on his Mac. Armed with optimism and patience, I figured it out.

My goal: To run HLM 6 + Stata 10 on my MBP without dual-booting or running a full-blown Windows emulator.

Random aside: I use these together because Stata just will not estimate complex models with random effects / coefficients. I can get a simple model to run in a few minutes in Stata, but I’ve left a model with 2-3 cross-level interactions running overnight with no convergence. Using HLM 6 + Stata 10 (+ the hlm .ado file), I can use Stata to pass the model to HLM, which then passes the results back to Stata for post-estimation export/analysis.

The process: Install CrossOver.

Install both HLM 6 and Stata 10 into the same Vista bottle. [Maybe I could have put them in an XP bottle, but to be safe I went with Vista.] It took me a bit of reading to figure out that they need to be in the same bottle if they are going to interact with each other. Because neither HLM nor Stata is officially supported, you have to install them as “unsupported” apps.

I used the HLM 6.04 CD-ROM to install HLM with no problem, and then used the old upgrade patch to bring it up to 6.08 just fine. Stata 10 was a bit more of a pain. I have a CD-ROM and a perpetual license. However, the CD-ROM has both Windows and Mac versions of the installer, and my Mac wouldn’t recognize or mount the Windows files. The CrossOver help files don’t really explain what to do in this case, and the various instructions I found online for mounting a Windows CD-ROM from the Mac terminal didn’t work for me. Instead, I fired up my old PC and copied the install CD-ROM to my NAS. [An important lesson here about switching from PC to Mac: copy any old software CDs you might need/want to an external drive before you burn your PC. It will save you headaches later.]

Once they are both installed, you need to follow the instructions to get HLM talking to Stata 10. In my setup, that meant adding the following line:

"PATH"="C:/Program Files/HLM6/WHLM.exe"

to the very end of the file located at

/Users/username/Library/Application Support/CrossOver/Bottles/HLM6/cxbottle.conf
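From Terminal, that edit amounts to appending a single line to the bottle config. A sketch (the bottle name “HLM6” and the install path are from my setup; adjust both to match yours):

```shell
# Append the registry-style PATH entry to the CrossOver bottle config.
# The mkdir -p is only there so the sketch runs even on a machine where
# CrossOver hasn't created the folder yet.
CONF="$HOME/Library/Application Support/CrossOver/Bottles/HLM6/cxbottle.conf"
mkdir -p "$(dirname "$CONF")"
echo '"PATH"="C:/Program Files/HLM6/WHLM.exe"' >> "$CONF"
tail -n 1 "$CONF"   # confirm the line landed at the very end of the file
```
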

I got it to work once, but have since had issues getting it to work with new files due to versioning of the Stata .dta file.

Posted in Uncategorized | Comments closed

Moving on up….

… to Maclandia. We’ve replaced our Linux and Windows PCs with MacBooks. The world is a scary but shiny new place. We also upgraded our NAS to 4 TB. It’s taken me a couple of weekends, but I’ve finally gotten all my files moved, various tools installed, and a full range of backup provisions configured. I’m mainly putting everything down here so that I can remember all the steps for setting up B’s tools and backups, and also in the event of hard-drive armageddon.

First lesson

Finder doesn’t play nice with my old workflow. In the past, I would sometimes put a working folder in my Dropbox and then, when the project was finished, copy it back into my home working directory. Windows had a nice way of merging or updating the changed files, prompting me if a file already existed. In Mac OS, I learned quite quickly that if you copy a folder over an existing folder of the same name, it deletes the old contents entirely and replaces them with the new. As a result, I will need to become proficient with rsync and the Terminal to do this merging effectively. [NOTE to self: Figure out the correct options/commands and post here.]

Setting up a new backup or redundancy plan

When I lived in PClandia, I never backed up my entire OS, on the philosophy that if something happened to my hard drive and/or OS, I would want to reinstall the OS anyway (esp. since Windows tends to get bogged down over time with installs and uninstalls, registry edits, etc.). My old strategy was to back up an exact copy of my user folder (which holds all my user profiles, data files, copies of downloaded software, etc.) to our NAS (RAID 1, 2 TB), and then have automatic backups run nightly to a 3 TB USB-connected drive. I’m still not sure about all the backup lingo, but I essentially had it set up so that the NAS was a mirror copy of my PC (including deletions), while the USB backup would never delete files but instead add new copies of modified files, renaming the old versions of files that had been updated.

A couple of things prompted me to rethink this strategy now that I live in Maclandia. First, there’s Time Machine. I understand it can be buggy, so I don’t want to rely on it alone. At the same time, if there were a hardware failure, it would be nice to be able to just reload everything onto a new machine (and since Mac OS doesn’t have Windows’ bogged-down-over-time problem – or so I’m told – I might not want to reinstall everything). This means I wanted to set up TM on the NAS, but also have my data backed up the old way. I want my home directory backed up separately so that, if my OS gets corrupted (again, remember, I’ve lived in PClandia for a long time), I can easily reload all my data onto a fresh OS install.

My strategy
I created a user share on my NAS for my TM backups based on these instructions. I had initially used the built-in TM features of the NAS, but got a couple of errors (e.g., TM had problems ‘verifying’ the backup and would then start a new one) when we tried to back up 2 MBPs to the same TM share. Rather than troubleshoot, I figured it was safer to keep them in separate NAS shares anyway. [Downside: In my version of the NAS firmware, there’s no option in the user interface to apply a quota to these user shares, though I can see a user quota file in the admin directory that I can probably edit by hand. So, that’s a to-do item.] These TM backups won’t themselves be backed up, since they are the backup.

Then, I set up rsync + Automator to back up my user home folder to another NAS share nightly. For now, I followed these simple instructions. The downside of the simplicity is that it doesn’t create a log file or email me the status of the backup or any errors. It looks like these more complicated instructions would do that, but I’m going to see how the simple version works for a while before tinkering with it. I’m also still using the NAS’s backup tools to do an incremental backup to the USB drive, like before.

Off-site backups and access to files from iPad

In addition, I want to save my work files offsite and also access them from the iPad. I have two options here: my backups user on Dreamhost (50GB) and/or Dropbox (free 5GB or paid 100GB). This got a little complicated because I also wanted to integrate Zotero (on the MBP) and iAnnotate (on the iPad) into my workflow.

Zotero will only let you sync your library using either their service (not enough space) or WebDAV. My Dreamhost backups storage doesn’t support WebDAV. I tinkered with some options (e.g., storagemadeeasy) to turn either my Dreamhost space or my NAS shares into WebDAV folders. But then I learned that Mac OS doesn’t play well (or at least easily) with WebDAV, so I abandoned that strategy. Also, though my NAS has an iPad app and related tools for accessing files remotely (via an encrypted tunnel and a myCloud app), in practice they haven’t worked seamlessly, or at all.

At every turn, it seemed that Dropbox was the only solid solution. Since I’m tired of shuffling files around to keep from filling up my free space, I’ve resigned myself to paying for Dropbox storage. It seems silly, because I have lots of storage at home and 50GB at Dreamhost, and I can get them to work together with lots of ad hoc tools and arrangements; but in this instance, I decided that convenience and reliability were more important than cost. Also, I’m really good at rationalizing money spent on technology.

I set up Zotero with my data directory in my home folder but my document folder in my Dropbox. My group library files are synced using the Zotero service. I’ve also copied these into my personal library as separate collections. I installed the ZotFile plugin to manage the documents. It took some back and forth and tweaking to get the settings right, so I’m going to put them all here for future reference.


[In the process of researching how to do this, I consulted these sources, which all had some part of the necessary information.] [I did try to get things set up with ZotPad, but it kept crashing and wouldn’t work. Maybe someday there will be an updated, stable version.]

For now, the Zotero + iAnnotate workflow will do what I want (though if ZotPad worked, it would be closer to the ideal). The goal is to keep a library of original .pdfs and also copies, labeled _md.pdf, that have been annotated. The tricky part is making sure both versions are linked to the Zotero database, which is what ZotFile’s “send to tablet” feature does within Zotero. [ZotPad would do this at the tablet, rather than in Zotero, but since it doesn’t work, it’s not really an option.]

Essentially, when I add .pdfs to entries in Zotero, I also need to send them to the ipad folder (highlight all items, right click, manage attachments, send to tablet). This makes a second copy of each file in the ipad subfolder within my Zotero document folder inside Dropbox. Then, to read a file on my iPad, I open iAnnotate, use the Dropbox connection to browse to the ipad folder, and read/annotate the file. When I’m done, I save the file back to Dropbox.

The next time I want to write/cite, I open Zotero. ZotFile creates two special search folders that let you see whether any tablet files need to be reconnected to their parent items in the Zotero database (“Tablet files modified”). So, before writing, I open that search folder to get the annotated files from the tablet back into the database and main library folder (highlight all items, right click, get from tablet). The result is one library with both my original .pdf and the annotated _md.pdf. ZotFile also copies all the annotations into the Zotero notes field, which is nice. Informally, I can also treat the ipad folder as a to-read folder, for those times when I’m sitting around bored with my iPad and nothing to do. [<- Ha! That’s what email is for. But, really, I’m never sitting around bored, and I never have nothing to do.]

It sounds a bit clunky, but actually isn’t too bad if you just remember to send all new items to your tablet when you add them and then re-get them from the tablet the next time you open Zotero (hence the need for the full 100GB in Dropbox… though you could set it up so just the ipad folder is in your Dropbox folder). The downside is that this workflow assumes a) you’re adding items to your library from your pc (not your tablet) and b) you always send new .pdfs to your tablet from Zotero+ZotFile standalone.
[Again, if ZotPad worked, you could use it to do the linking of files to the database tablet-side rather than pc-side, which would add the flexibility to add/edit Zotero library items on the iPad.]

One more off-site backup, just because I’m paranoid

My home folder is synced to my NAS and then backed up to an external USB drive. My Dropbox folder includes all my work files, including my .pdf library, so everything is also synced to Dropbox. [My Dropbox folder also includes some shared folders with Brian and others that are not part of my work directory.] Because my work files are the most important, I’ve also set them (less the .pdf library) to back up to the Dreamhost backups user using rsync and Automator. If I worked at a university that provided secure data backups for faculty (ahem!), I could have tweaked these instructions to use that space. [If my university provided secure data backups + WebDAV, I probably could avoid Dropbox altogether. (sigh)] Here are the instructions I followed to back up just my work folder to Dreamhost, with a variation on the iCal/Automator step at the end:
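The core of that Dreamhost job is just rsync with an `--exclude` for the .pdf library. A local sketch: in real use the destination would be an ssh target on the Dreamhost backups user (e.g., `user@server:work/`, hypothetical), but here it’s a local folder so the example runs anywhere:

```shell
# Copy the work folder offsite, skipping the big .pdf library.
# DEST is a local stand-in; swap in the ssh destination for the real job.
SRC="$HOME/work-demo/"
DEST="$HOME/offsite-demo/"
mkdir -p "${SRC}library" "$DEST"
echo "draft"    > "${SRC}paper.txt"
echo "not text" > "${SRC}library/article.pdf"
rsync -a --exclude "library/" "$SRC" "$DEST"
ls "$DEST"                      # paper.txt only; library/ was excluded
```
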

Posted in Uncategorized | Comments closed

Christmas stories

I’m sure every family has a collection of stories they share around Christmas. One of the favorites in my mom’s family is about the time, when my mom was in elementary school, that her dad told her they weren’t having Christmas so that she would stop bugging them about her presents. He told her they couldn’t afford Christmas that year.

My mom told her teacher that they weren’t having Christmas because they didn’t have any money. So, my mom was allowed to drag the classroom tree home, and apparently it was a sad, pitiful thing. Grandma and Grandpa didn’t understand why the teacher gave the tree to my mom.

They didn’t understand later, either, when various church and charity groups from around Flint brought baskets of food and gifts to their house. And, despite my Grandma’s protestations that they didn’t need charity, everyone insisted they accept the goods because everyone deserves to have Christmas.

Eventually, they figured out what had happened.

Posted in Personal | Leave a comment

Customizing Emacs

I figure one way to become more familiar with Emacs is to customize some of its settings. Of the various pages on the web that describe different customizations, these two have been the most useful.

First, I wandered around the Emacs customization menu by clicking the Customize Setup link on the original home screen. There were so many options there that it was a bit overwhelming, even if you browse using the M-x customize-browse command. I did use the menu to “Save for future sessions,” which created a virtually empty .emacs file for me. I figured that way the file would be saved in the correct place (i.e., my home directory).

Next, I thought I’d browse the options for changing the color scheme. These instructions were the most helpful for installation. Once installed, I typed M-x color-theme-select and then clicked/RET on the different themes until I found one I liked. Not too complicated. Seems you can customize your own theme using this tool, too.

I poked around online and found several examples of .emacs.el or init.el files. (Seems like it’s a “thing” to post your Emacs customizations online… kind of like the equivalent of a g33k bicep flex.) I also read more about the pros/cons of Emacs vs. WinEdt. It seems like one of the big cons of Emacs for Windows users is that several of the keyboard shortcuts and/or default behaviors are so different in Emacs that it creates problems if you’re moving back and forth across tools. Sure, you could customize Emacs to get rid of some of those annoying default behaviors, but, boy, does that seem like a lot of work. And, while it’s easy in Linux to install packages from the command line, in Windows it’s a several-step process.

What I’ve learned so far? I could be a super g33k and use Emacs if I really wanted to, but I’m not sure I really want to. As far as tools for writing and editing text files (research articles and R or Stata code) go, there’s a range of options, from g33kiest (Emacs+AUCTeX, with apologies to Chris) to smart-but-pragmatic g33k (WinEdt) to lazy g33k (LyX, with apologies to Justin) to least g33k (MSWord). [Actually, I guess a real geek would write LaTeX so natively they could just use Notepad2 or, heaven forbid, the Notepad that comes with Windows. Or, I guess a real geek would be using a Linux box anyway, but I digress...]

So, while I am confident I could learn to use Emacs – and maybe it really would be transformative and change my life forever – I think I’m going to be more pragmatic and move on to WinEdt for now.

Updated to add: FWIW, I’ve saved the most useful pages about setting up Emacs on Windows to my Delicious.

Posted in Research | 2 Comments

Follow-up to post about LaTeX, etc.

Yesterday, I provided a rough outline of the steps for installing Emacs + AUCTeX in Windows 7.

Upon further reflection, I realized a few things about the process. First, most of the documentation for these tools is outdated and/or poorly written (at least for recent versions of Windows). Second, it’s way easier to install this stuff than the documentation would lead you to believe. Third, using the software itself is also way easier than the documentation, or your own worries, would lead you to believe.

Specific examples?

Installing Emacs in Windows 7 is a three-step process: download the file, unzip it into your preferred directory (e.g., C:\Program Files\emacs\), and set the HOME environment variable to point to %USERPROFILE%.

Installing AUCTeX? Download the file, then unzip it into the Emacs directory (above). However, if you read the AUCTeX manual, you’d think this was a 30-step process.

Everything seems to work OK. AUCTeX loads when a .tex file is being edited, and it works. This setup even comes with a spell checker already installed, even though some of the [outdated] online documentation would lead you to believe you have to install one separately. Perhaps it’s broken in a way I’m too n00bish to recognize, but so far, so good.

I went through the Emacs tutorial and the first 2 of these tutorials, and everything seemed pretty straightforward. These ref cards for AUCTeX and Emacs seem to have all the essentials. It’s just a matter of learning different keyboard shortcuts than the ones I’ve been using in Notepad2 or Word.

On the other hand, there’s this mystery .emacs file that is supposed to be where I put my personalized settings for Emacs. I see an .emacs folder, but not a file. So, that’s a mystery to me, but I’m sure I can figure it out. The next steps: customize my environment, export my existing bibliography files to a .bib file, and see if I can create a custom formatting .bst, or whatever it’s called.

Posted in Uncategorized | Leave a comment

Installing and learning LaTeX

Long on my to-do list has been learning LaTeX, mainly for its BibTeX features. I’ve avoided it for so long because I’m pretty adept at MSWord. However, with Office 2010 and that blasted ribbon, they broke all my keyboard shortcuts for menu items (imagine a grumpy old professor shaking a fist at the gods and yelling).

As a first step, I emailed Chris (not to be confused with this CL) to ask for suggestions, which he graciously provided. Then, having narrowed things to Emacs+AUCTeX (Chris’s strategy) and WinEdt, I turned to FB, which yielded additional suggestions, including Sublime Text and LyX. The latter is apparently a good GUI tool, but the exported raw .tex can be clunky, according to those who have used it. (I imagine it’s not unlike creating a webpage in Word, which you can do, but I wouldn’t recommend.)

I decided to install Emacs, LyX, and WinEdt, but I think I’m going to start with Emacs and try LyX and WinEdt when I inevitably get stuck.

Figuring out which version of Emacs to use and how to install it also took a bit of effort, particularly because a lot of the how-tos, including their links to packages, don’t necessarily point to the latest stable version. I found these instructions to be the clearest. However, I went to the original source for the install files. It seems that AUCTeX isn’t compatible with the most recent version of Emacs, so I had to go back and re-install Emacs. Typical (for me, at least).

Here are the (abbreviated) steps I went through:
0.5. Read several introductions to LaTeX, including some recommended by Chris.
1. Install MiKTeX.
2. Install WinEdt.
3. Install LyX. Since I already installed MiKTeX, I had to navigate to the tex.exe file in the miktex\bin\x64 folder during LyX install. It seemed to find and get all the missing packages and automatically install them in the correct MiKTeX directory (thank goodness).
4. Install latest version of Emacs.
5. Set HOME property.
6. Install version of Emacs that works with AUCTeX.

Next step(s): Check on spellchecking features, setup preferences, etc.

Edited to add: Here’s my first document. ;)

Posted in Personal | 1 Comment

Things I have been thinking about

I’m trying to get back in the habit of posting here, even if it’s not super important, interesting, or research-related.

Yesterday, I submitted another small grant application for OPOSSEM. We need some MediaWiki development work done, and [surprise] I haven’t been able to find anyone able or willing to work for free. Hopefully, with some funds available, we can hire a firm to get the wiki-based textbook going. Also, I exchanged a flurry of emails this morning about next steps: including getting an official launch message out there, figuring out how to recognize and reward user contributions, and hopefully starting a project similar to the Wikipedia Ambassadors program, but for the OPOSSEM textbook. Of course, that will be easier once the programming is done on the MediaWiki site.

I also exchanged a round of emails with various folks about open-source publishing – not strictly OPOSSEM-related, but about journals as well. It is apparently an issue percolating at the Canadian Federation for the Humanities and Social Sciences. A similar debate is going on in the U.S. And, of course, there’s the recent debate about Elsevier and its prices for bundled journal subscriptions. I don’t have anything to add here, but I do find the politics of this interesting. I tend to think that much knowledge is publicly funded, through university and targeted research funding, and that the fruits of that funding should therefore be a public good, as widely available as possible. However, faculty members tend to be a conservative bunch, and really it’s going to require academics to be a little less resistant to change and more open to new and different metrics of “quality” than whether a small number of gatekeepers at certain journals deem content to be of quality.

Finally, last week I learned something new about academic hiring practices in Canada, and I’ve been thinking about it a bit since we have a couple of on-going searches in our department right now. I knew that all academic ads in Canada include some language about first priority for Canadian (resident or citizen) applicants. Clearly, it’s not an insurmountable hurdle, since I got hired up here, and many others do as well. I had always assumed it was a sort of Canada-first employment policy that applied to all jobs. Turns out it’s not. It just (or mostly?) applies to academic jobs, which was explained to me by Daniel Béland during coffee chitchat during last week’s graduate student conference (awesome, BTW…. the US could stand to have more of these opportunities for grad students). In any event, I skimmed a couple of articles about the policy to supplement what Daniel told me, and now, I find it fascinating.

Short version: In the late 60s, a number of academics were worried about the influx of Americans into the Canadian academic job market, and [though I don't know whether it was material or cultural interests driving it] they framed a movement around the effect this was having on the content (i.e., too much American sociology, not enough CanCon) of the curriculum in Canadian universities. In 1982, the movement successfully got a provision added to immigration law to protect academic jobs from non-Canadian academics. At least one article suggests that the movement had the effect of increasing CanCon in campus curricula in the social sciences.

I’m curious about the effect the law has on the Canadian academic job market, and also on the training of Canadian PhDs. Economists certainly would have one hypothesis about the effects of protectionism on an industry, but I wonder whether it applies here. And, of course, I wouldn’t really want to touch that debate directly with a 10-foot pole. My sense is that enough universities find ways around the law if they really want to, and there is a fundamental problem in Canada of not enough growth in tenure-track jobs to satisfy the number of PhDs we produce collectively (and I do think the academy has a collective responsibility to try to employ as many as we can of the qualified individuals we train, or to stop training so many). So, it’s complicated. And, interesting. And that’s all I have to say about that. :D

Posted in Profession | 4 Comments

Shameless self-promotion

The editorial assistant for the Journal of Pension Economics and Finance just sent me a copy of this book review. Detailed TOC here. Lots of economists at Big Name Schools there. And, my chapter is mentioned for its contribution. :D

Posted in Uncategorized | Leave a comment

Doyle, our NAFTA kitty

Earlier this week, we said goodbye to Doyle, our NAFTA kitty. He was almost 17 years old. Doyle was our second pet; we still have our first cat, Pannonica, who is either 17 or 18.

We got Doyle in January 1995 from the Austin Humane Society on North 183. I went looking for a cat to keep Pannonica company, and because she seemed partial to one of us – Brian or me – and not the other. So clearly the other one of us needed a cat. Brian and I differ in our recollections about “whose cat” Doyle was supposed to be. I think Pannonica liked me, and Doyle was supposed to be a cat for him. But then, somewhere along the line, Pannonica became “his cat,” and Doyle became “mine.” He thinks this was the plan all along. In any case, we both liked Doyle when we met him. He was a tiny kitten, only 8 weeks old. He literally bounced around the little visiting room, and I just couldn’t resist him.

By Christmas 1997, Brian and I had two other pets, too, Slim (a hundred pound, sorta crazy American bulldog) and Syeeda (another short hair). We were living in Chapel Hill, NC, and I was in graduate school. That Christmas, I took all four pets with me to visit my grandparents and family in Florida. Doyle picked his first fight with Slim during that trip. Later that spring, Doyle started another fight with Slim in which Doyle ended up with a broken nose and a claw ripped off. After that, we had to keep him away from Slim because Doyle just wouldn’t back down.

That’s how it came to be that when we moved to Mexico City in 1998, Doyle came with us. My mom took care of the other 3 pets. I think Doyle liked our apartment in Coyoacan, mainly because it was the only time he was allowed outside with supervision. He was allowed in the courtyard of our small four unit building, and the first time he looked up and saw the sky and not a roof, he was visibly startled. But by the time we left, he liked laying in the courtyard sunshine. In 2001, he went back to Mexico City with us a second time.

In January 2002, we said goodbye to Slim after a short but devastating illness, and quickly decided to get another dog, Mance. Doyle still didn’t like dogs, and sometimes didn’t seem to like the other cats either, but Mance quickly learned his place (last) in the pet hierarchy. Sometimes, it was clear that Doyle was intentionally bossing Mance around. For instance, if we were playing fetch in the house with Mance, Doyle might casually wander near the toy without looking at it or at Mance, effectively preventing Mance from getting near the toy and forcing him to look back over his shoulder at us with an expression of despair at not being able to bring us the toy. Other times, Doyle would stand right in the middle of the hallway just when I called the dog to go for a walk; he seemed to know that Mance would get stuck behind him and not know how to get around. Doyle also liked to push Mance, a 90 lb. olde English bulldog, off the communal water dish, where he would seem to drink for so long that Mance would be forced to lie down a few feet away and wait his turn. Doyle had been our crankiest (or perhaps passive-aggressive is a better way to describe it) cat for a while.

I called Doyle our NAFTA kitty because he lived with us in all three NAFTA countries, the only one of our pets for which that is true. He was a part of our lives for a long time. While we miss him, we’re grateful that the end was not protracted and that he was not in pain. He got a respiratory infection, and though it seemed to be getting better with antibiotics for a few days, he suddenly stopped eating and got very weak. He was already very underweight. Force-feeding him at home didn’t seem to be working, so it was time to let him go.

It’s hard to tell if the other cats miss him. They haven’t been close kitties for a while, each instead keeping to their separate spaces. Brian said Syeeda laid with him that last day on the couch, and she was sniffing his favorite spot a few days after he was gone. I can’t tell if we’re giving them extra attention because they need it or because we do.

Posted in Uncategorized | Comments closed

Ninja librarians and other forms of surveillance

Consider this just a big link dump for Brian. Now there’s an easy response when people ask what you might do with a Master of Information degree: Information Ninja!
The CIA (see also)
Ninja librarians
Sentiment analysis
World mood ring
Tracking DC homicides

Posted in Uncategorized | Comments closed