The content on this site is my own and does not necessarily represent my employer’s positions, strategies or opinions.
This is a presentation I gave yesterday at the International Association of Privacy Professionals in Washington, DC USA, on March 6, 2014. This short presentation was meant to stimulate ideas that would then be complemented by discussions about privacy policy as it relates to Big Data, and in that sense it does not cover all aspects of privacy raised by the issues discussed.
This post is a great example of why you should never say that you are starting a new series of blog entries. In February of 2010, I wrote a blog post called Virtual Life with Linux: Standalone OpenSim on Ubuntu 9.10, saying:
As a complement to my Life with Linux blog series, I’m introducing another series which explores what I can do in virtual worlds and immersive Internet environments on Linux.
I wrote two entries, and that was it. Well, here is the third entry, notes from trying to install the latest version of OpenSim on Ubuntu Linux 13.10. I’m not going to go through all the steps involved, but mostly talk about some of the glitches I encountered and how I resolved them.
First, some notes on Ubuntu 13.10. I have a dual boot pc with Windows 7 and Ubuntu on it. I used to do a lot with Linux because it was my job and also because I loved the experience of trying all the distros, seeing what was new, and playing with the features. Well, I moved on to a job involving mobile and then running the math department, and I really did not touch Linux for a long time. Long as in the version of Ubuntu on my machine being from 2009.
I fired this up several weeks ago and started the upgrade process, which was excruciatingly slow. Somewhere in there I accidentally hit the power button on the computer and that pretty much wiped out the Ubuntu image. Don’t do that. I eventually burned a DVD of Ubuntu 13.10. Once again the updates were really slow.
This weekend I did the clever thing and did a web search for “slow Ubuntu updates.” The main suggestion was that I find a mirror closer to me, and this made a huge difference. I went into the Ubuntu Software Center, picked Edit | Software Sources, went into Download From, picked Other…, and found a mirror 40 miles from my house. Problem solved.
32 bit Libraries
I installed the 64 bit version of Ubuntu, but you are going to need the 32 bit libraries. There’s a lot on the web about how to do this for older versions of Ubuntu, how you should use multiarch libraries, how you don’t need to do anything at all, and so on. Eventually I found this solution, from the forums for the Firestorm virtual world viewer, and it worked. There are other ways to accomplish the same thing, but this does the job.
sudo apt-get install libgtk2.0-0:i386 libpangox-1.0-0:i386 libpangoxft-1.0-0:i386 libidn11:i386 libglu1-mesa:i386
sudo apt-get install gstreamer0.10-pulseaudio:i386
You need the complete mono package, not just what you install from the Ubuntu Software Center.
sudo apt-get install mono-complete
See the OpenSim build instructions for other platforms.
Install the MySQL client and server from the Ubuntu Software Center. You will be asked for a root password, so write it down somewhere.
There are several ways of getting and installing OpenSim. When I last did this four years ago, I took a “from scratch” approach but I’m doing it more simply now. I used the popular Diva Distribution of OpenSim which comes set up for a 2×2 megaregion (that is 4 regions in a square that behave like one great big region). What you lose in some flexibility you gain in ease of installation and update. Once you download and expand the files, start reading the README.txt file and then the INSTALL.txt file. Other files will tell you more about MySQL and mono, but you did the hard work above.
Since I am not connecting this world to the Internet, I did not bother with the DNS name, I simply used localhost at 127.0.0.1.
Follow the instructions for configuring OpenSim and getting it started. You’ll need to give names for the four regions, which I’ll call R1, R2, R3, and R4. These are laid out in the following tile pattern (north at the top):
R2  R4
R1  R3
You will need to know this if you decide to change the terrains for your world.
For example, suppose you had four terrain files called sw.raw, nw.raw, se.raw, and ne.raw in the terrains subdirectory of your OpenSim installation.
Then you would issue the following from within the OpenSim console to set the terrains for the regions:
change region R1
terrain load terrains/sw.raw
change region R2
terrain load terrains/nw.raw
change region R3
terrain load terrains/se.raw
change region R4
terrain load terrains/ne.raw
A web search will find you many options for terrains. Basically, they are elevation files for your region.
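If you want to experiment with making your own, a terrain is really just a grid of heights. Here is a small Python sketch that writes a toy island heightmap as raw 32-bit floats; if I remember the formats correctly, OpenSim’s terrain load accepts this as a .r32 file for a single 256 by 256 region, but check terrain help in the console to be sure. The file name is just an example.

# Toy terrain: a gentle island rising above OpenSim's default 20 m water level,
# written as raw little-endian 32-bit floats for one 256 x 256 region.
import numpy as np

SIZE = 256
WATER = 20.0
y, x = np.mgrid[0:SIZE, 0:SIZE]
center = (SIZE - 1) / 2.0
dist = np.sqrt((x - center) ** 2 + (y - center) ** 2) / center   # 0 at the center, about 1 at the edges
heights = WATER + 15.0 * np.clip(1.0 - dist, 0.0, 1.0)           # island peaks 15 m above the water
heights.astype("<f4").tofile("island.r32")

Copy the result into the terrains subdirectory and load it with the terrain commands shown above.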
Getting a Browser
I believe that all the popular browsers out there for OpenSim are evolutions of some major versions of the Second Life browser after they were open sourced. This OpenSim page has details on your options. If you have a choice, get a 64 bit browser if you are using a 64 bit Linux. I’ve had good luck with both Firestorm and Kokua.
Maria Korolov extensively describes the different ways of getting an OpenSim region up and running in her article OpenSim 102: Running your own sims. In particular, she discusses New World Studio, and I’ll be trying to get that running on my MacBook.
Several years ago I spent quite a bit of time in Second Life when it was the hot 3D social world. The promise was that you could build and visit worlds that had been uniquely constructed by the users. As such, it was a dynamic environment that tended to be slow as all the shapes, buildings, and textures were loaded.
People can customize their in-world presences extensively, from body shape to the clothes and decorations worn. Indeed, you don’t even need to look like a person. Note, however, that you probably should not show up to a business meeting in the form of a squirrel, as my now-retired colleague Irving Wladawsky-Berger once said.
Over time, Second Life fell out of fashion as a world where businesses could set up sites where clients or interested people could visit, learn about products or services, and talk to real people, albeit in avatar form.
For internal business meetings, the lack of truly secure conversation was a problem. We used teleconferences for the voice, and Second Life for the environment. As meetings went on, participants often went inactive, or fell asleep, in Second Life, and we were back to phone meetings as usual.
Second Life lives on today as a social world. That’s never been much of an interest to me, but to each his or her own. It seems to be quite vibrant across a broad range of what “social” means.
My interest in it was always more in the construction aspects, and I’ve written extensively about the techniques involved. See Building in Second Life, By Example. I still get many links to this site from people looking to build moving doors, for example. I also had a long series of blog entries about how to do things in Second Life called My Second Life. Note that this is from 2006, so it is getting a bit old.
You can see all my writings on Second Life by going to the top of this page and entering “second life” in the search box on the right side.
Here is the net for me with Second Life: it is too expensive to be as slow as it is, especially if I only want to use it as an advanced 3D building environment. While new ways of building objects have been introduced, it’s hard to see a lot of difference from the way it was five years ago. I still visit from time to time, but I own no land and spend no money there.
OpenSimulator, or OpenSim for short, is a reimplementation of the Second Life server in open source. It is written in C#, so it requires Windows or the Mono environment on Linux. It does not include a browser, but several are available.
Other than the OpenSim site itself, the best source of information about the technology and the worlds built with it is Maria Korolov’s Hypergrid Business. It is excellent.
Some of the features of OpenSim include:
- an active development community
- better in-world programming options
- the ability to host a world on your own computer, which is completely free
- many online paid hosting options
- the ability to connect your world to several choices of “grids,” or collections of worlds
- teleporting from one world to another across a grid
This means that I could set up a world on my local computer, do all the building I want on it, save an image, and then transfer it to a hosted server. If you can and want to connect your computer to the Internet, you can host your world from there and have others visit it.
To see a modern use of OpenSim, read the article $250,000 project models cities in OpenSim.
Some of the potential downsides are:
- hosting providers come and go, though some have been around for years
- it may be more difficult to find assets you need at the quality you want, for example textures, but there are guides for finding free content
- it is probably best if you have some technical chops or know someone who does
So Second Life costs money to own land and to buy some assets, and is more restrictive. OpenSim and the worlds and grids associated with it provide more freedom, but you are more on your own and there might be some long term risks related to hosting. For me, the freedom is worth the risk.
In 2010 I wrote a blog entry called Virtual Life with Linux: Standalone OpenSim on Ubuntu 9.10. I’ve recently gone through the experience of doing this on Ubuntu 13.10. I’ve published some notes on what I did this time to install it on my pc in Virtual Life with Linux: Standalone OpenSim on Ubuntu 13.10.
For a small personal project I’m starting, I wanted to get elevation data for the area surrounding our property in upstate New York. A quick web search yielded The National Map website, a service of the US Geological Survey.
The information and products on the site are extensive, but for my purposes I followed the link to The National Map Viewer and Download Platform. From there I zoomed down to the area of our house, and started looking at what was available. After several experiments, I decided to download a portion of the National Elevation Dataset at 1 arc second resolution. The 1/3 arc second version was also available, but, as expected, was 9 times bigger.
The readme.pdf file starts with the following:
The U.S. Geological Survey has developed the National Elevation Dataset (NED). The NED is a seamless mosaic of best-available elevation data drawn from a variety of sources. While much of the NED is derived from USGS Digital Elevation Models (DEM’s) in the 7.5-minute series, increasingly large areas are being obtained from active remote sensing technologies, such as LIDAR and IFSAR, and also by digital photogrammetric processes. Efficient processing methods were developed to filter production artifacts in the source data, convert to the NAD83 datum, edge-match, and fill slivers of missing data at quadrangle seams. NED is available in spatial resolutions of 1 arc-second (roughly 30 meters), 1/3 arcsecond (roughly 10 meters), and 1/9 arc-second (roughly 3 meters). The dataset is updated with “best available” elevation data on a two month cycle.
These digital elevation datasets are essential in understanding the Earth’s landscape: elevation, slope, and aspect (direction a slope faces.) NED is critical to identifying and modeling geologic features such as water drainage channels and basins, watersheds, peaks and pits, and movements such as avalanches. NED is used to create relief maps, 3-D visualizations, to classify land cover and to geometrically correct data from satellite or aircraft sensors (orthorectification). The fire community, natural resource managers, urban planners, conservationist, emergency responders, communication companies to name a few all rely on these elevation datasets. This data also supports The National Map.
Now I have to figure out how to process the file, which I’ll do by looking at the data dictionary elsewhere on the site and writing some code.
Update: Even though I zoomed down to a rectangular area less than one block on a side, the downloaded data covers a 1 degree by 1 degree square of elevation data. That’s more data than I was expecting and I’ll have to pull out a subset.
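Here is the kind of minimal reader I have in mind, assuming I take the GridFloat flavor of the download (a .flt file of 32-bit floats plus a small text .hdr file); the ArcGrid and IMG flavors would need different handling, and the file name and coordinates below are placeholders.

# Sketch: parse the GridFloat .hdr, load the .flt as a grid of 32-bit floats,
# and crop a window of cells around a point of interest.
import numpy as np

def read_gridfloat(basename):
    hdr = {}
    with open(basename + ".hdr") as f:
        for line in f:                      # lines look like "ncols 3612", "cellsize 0.000277...", etc.
            key, value = line.split()
            hdr[key.lower()] = value
    ncols, nrows = int(hdr["ncols"]), int(hdr["nrows"])
    dtype = "<f4" if hdr.get("byteorder", "LSBFIRST").upper().startswith("LSB") else ">f4"
    data = np.fromfile(basename + ".flt", dtype=dtype).reshape(nrows, ncols)
    return data, hdr

def crop(data, hdr, lon, lat, half_cells=200):
    # Row 0 is the northern edge of the tile; xllcorner/yllcorner give the lower-left corner.
    cell = float(hdr["cellsize"])
    col = int((lon - float(hdr["xllcorner"])) / cell)
    row = int(data.shape[0] - 1 - (lat - float(hdr["yllcorner"])) / cell)
    return data[max(row - half_cells, 0):row + half_cells,
                max(col - half_cells, 0):col + half_cells]

elevations, header = read_gridfloat("ned_tile")              # placeholder base name
subset = crop(elevations, header, lon=-73.9, lat=42.7)       # placeholder coordinates
print(subset.shape, float(subset.min()), float(subset.max()))

At 1 arc second, a 200 cell half-window works out to roughly a 12 km square, which is plenty for the area around the property.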
This has been a long hard winter in the northeastern United States, and I don’t think we’re done yet. Earlier this week New York and Philadelphia got several inches of snow topped by ice. Farther upstate in New York, where I live, we only got light, fluffy snow, but we got a foot of it.
Being February and knowing that we have close to two months before we are likely to see the last of the snow, my thoughts often go to summer. In particular, to summer in the Adirondacks where we have a family place on Cranberry Lake.
I try to get up there in April to check the place out after the long winter, looking for downed trees and telephone lines. This is what it looked like last April 15:
The left front corner of the dock is pushed into the ice, which was still probably 8 to 10 inches thick. Later in the spring, the dock broke loose on the right side and swung around way to the left. I’m hoping that does not happen when the ice goes out this year.
In contrast, this was the dock a year and a day before the above photo, on April 14, 2012:
I don’t recall the exact temperature, but I’m guessing it was in the upper 40s or lower 50s F. The ice had been gone for nearly a month.
With our winter so far this year, I suspect that this April will look like last year at the lake. Being an optimist, I’m pulling for the warm, sunny, ice-free version.
For several weeks my wife has been asking me to convert our old Proform treadmill into a treadmill desk. The idea, exercise-wise, is to be on the treadmill for several hours a day at speeds less than 2 mph.
This weekend my son and I did it, and the conversion was straightforward. To start, we moved the treadmill into my wife’s home office and then removed the walking-pole-like handles and the decorative guards where the hand supports connected to the treadmill console. This allowed the treadmill to fit in the space better and gave us more access to attach the desk.
I then took a 12 inch deep piece of 3/4 inch birch plywood that I happened to have in my shop and cut it to the outside width between the handles. This is the actual desk, and the 12 inch depth was sized to fit my wife’s MacBook Pro. I rounded off all the edges with a router and then sanded them to make them as smooth as possible.
To mount the desk, I drilled holes in the corners of the plywood approximately 1 inch in from each side. To make these smoother, I used a countersink on both sides of each hole. I then used plastic corner ties to attach the wood to the bottom of the treadmill handles. The tie connectors were on the bottom side of the wood and the excess plastic straps were trimmed.
This mounting position was approximately the right height for my wife. If you need it lower, you can put wooden strips or blocks between the desk and the handles. Similarly, if you want it higher, you can mount it on top. The idea is to have the laptop keyboard at a comfortable height that you can use while walking.
The laptop screen is too low for comfortable use, so I added a 19 inch Samsung LED TV with an HDMI input and mounted it (VideoSecu TV Wall Mount Articulating Arm Tilt Swivel Bracket) on the bookcase/wall in front of the treadmill. While the mounting hardware is adjustable, I suspect I will be raising it an inch or two as my wife gets more experience in using the desk. It should be at a comfortable height so you can look straight ahead and not crane your neck.
Possible things yet to do: put a couple of coats of water-based polyurethane on the wooden desk, raise the TV, and strap down all cables.
This is a Dylan classic from 1965’s Bringing It All Back Home album.
This first video of the song is from the 1964 Newport Folk Festival where Dylan is introduced by Pete Seeger.
Move forward 26 years to 1990 and we have Dylan joining many of the original Byrds performing the song at a Roy Orbison tribute concert.
I’ve never been that thrilled with the Byrds’ versions of Dylan’s songs, but this is a fun take on it. There’s a lot of energy and excitement in the crowd when Bob walks out a couple of minutes into the song. Rolling Stone magazine has an article about the reunion.
Previous: Thunder on the Mountain
I think this song has some of the best lyrics from the most recent phase of Dylan’s career:
I got the porkchops, she got the pie
She ain’t no angel and neither am I
Shame on your greed, shame on your wicked schemes
I’ll say this, I don’t give a damn about your dreams
Still Dylan being warm and fuzzy. Also, I wouldn’t have thought of the rhyme for “orphanages” that he uses here.
The video is a collection of Dylan in performance and snippets from other videos through the years. It appears to be an official release from SONY and not a YouTube contributor amalgam.
As a much later counterpoint to the album and song The Times They Are A-Changin’, this song was part of the soundtrack for the film Wonder Boys in 2000 and is on The Essential Bob Dylan album. I like it because of the song itself but also because it shows the significant change in direction Dylan took with his music as he got older.
I’ve been coding, a.k.a. programming, since I was 15 years old. Since then I’ve used many programming languages. Some of them have been for work, some have been for fun. I mean, really, who hasn’t done some programming while on vacation?
Somewhat chronologically, here are many of the languages I’ve used with some comments on my experience with them. In total I’ve written millions of lines of code in the various languages over four decades.
Basic: This is the first language I used. While primitive, I was able to write some long programs such as a Monopoly game. In between coding sessions, I saved my work on yellow paper tape. I fiddled with Visual Basic years later, but I never wrote anything substantive in it.
APL: Now we’re talking a serious language, and this is still in use today, particularly by one statistician in my group at IBM. I was editor of the school newspaper when I was a senior in high school and I wrote a primitive word processor in APL that would justify the text. It sure beat using a typewriter. Some modern programming languages and environments like R and MATLAB owe a lot to APL. They should mention that more.
FORTRAN: My first use of this language was for traffic simulations, and I used a DYNAMO implementation in FORTRAN in a course I took one summer at the Polytechnic Institute of New York in Brooklyn. Forget interactive code editing, we used punch cards! FORTRAN was created at IBM Research, by the way.
PDP 11 Assembler: I only took one Computer Science class in college and this was the language used. Evidently the course alternated between using Lisp and Assembler as the primary language in which the students wrote. However, our big project was to write a Lisp interpreter in Assembler, which got me hooked on ideas like garbage collection. No, I did not and do not mind the parentheses.
csh, bash, and the like: These are the shell scripting languages for UNIX, Linux, and the Mac. I’ve used them on and off for several decades. They are very powerful, but I can never remember the syntax, which I need to look up every time.
Perl: Extraordinary, powerful, write once and hope you can figure it out later. Just not for me.
PL/I: Classic IBM mainframe language and it saved me from ever learning COBOL. When I was a summer student with IBM during my college years, we used PL/I to write applications for optimizing IBM’s bulk purchases of telecommunications capacity for voice and data. It was basically one big queuing theory problem with huge amounts of data. It was big data, 70s style.
Rexx: This language represented a real change in the way I viewed languages on the mainframe. Rather than being obviously descended from the punch card days, it was a modern language that allowed you to imagine data in more than a line-by-line mode and helped you think of patterns within the data. It was much easier to use than the compiled languages I had used earlier. My primary use for it was in writing macros for the XEDIT editor.
Turbo PASCAL: This was my main programming language on my IBM PC in the 1980s. The editor was built in and the compiler was very fast. I used it to write an interactive editor like the mainframe’s XEDIT, as well as a Scheme interpreter.
Scheme: A very nice and elegant descendant of Lisp, it was long considered an important programming language for teaching Computer Science. That role has been largely usurped by Java. I liked writing interpreters for Scheme but I never did much actual coding in it.
VM Lisp: This was a Lisp dialect developed at IBM Research for mainframes. My group there, led by Dick Jenks, used it as the bottommost implementation language for computer algebra systems like Scratchpad, Scratchpad II, and Axiom. Like other Lisps, this had two very important features: automatic garbage collection and bignums, also known as arbitrarily large integers.
Boot: An internal language at IBM Research built on Lisp that provided features like collections and pattern matching for complex assignments. It had many advantages over Lisp and inherited the garbage collection and bignums. From time to time I and others would rewrite parts of Boot to get more efficient code generation, but the parser was very hard to tinker with.
Axiom compiler and interpreter languages: The IBM Research team developed these to express and compute with very sophisticated type hierarchies and algorithms, typical of how mathematics itself is really done. So the Axiom notion of “category” corresponded to that in mathematics, and one algorithm could be conditionally chosen over another at runtime based on categorical properties of the computational domains. This work preceded some later language features that have shown up in Ruby and Sage. The interpreted language was weakly typed in that it tried to figure out what you meant mathematically. So x + 1/2 would produce an object of type Polynomial RationalNumber. While the type interpretation was pretty impressive, the speed and ease of use never made the system as popular as other math systems.
C: Better than assembler, great for really understanding how code translates to execution and how it could get optimized. Happy to move on to C++.
C++: Yay, objects. I started using C++ when I wrote techexplorer for displaying live TeX and LaTeX documents. I used the type system extensively, though I’ve always strongly disliked the use of templates. Several years ago I wrote a small toy computer algebra system in C++ and had to implement bignums. While there are several such libraries available in open source for C and C++, none of them met my tastes or open source license preferences. Coding in C++ was my first experience with Visual Studio in the 1990s. The C++ standard library is simply not as easy to use as the built-in collection types in Python, see below.
Smalltalk: Nope, but largely because I disliked the programming environments. The design of the language taught me a lot about object orientation.
Java: This is obviously an important language, but I don’t use it for my personal coding, which is sporadic. If I used it all day long and could keep the syntax and library organization in my head, that would be another story. I would be very hesitant to write the key elements of a server-side networked application in something other than Java due to security concerns (that is, Java is good).
Ruby: Nope. Installed many times, but it just doesn’t make me want to write huge applications in it.
PHP: The implementation language behind many popular web applications. If you want to spit out HTML, this is the way to do it. I’m not in love with its object features, but the other programming elements are more than good enough to munch on a lot of data and make it presentable.
Objective-C: Welcome to the Apple world, practically speaking. It hurts my head, but it is really powerful and Apple has provided a gorgeous and powerful library to build Mac and iOS mobile apps. My life improved when I discovered that I could write the algorithmic parts of an app in C++ and then only use Objective-C for the user interface and some library access.
Python: This is my all time favorite language. It’s got bignums, it’s got garbage collection, it’s got lists and hash tables, it can be procedural, object-oriented, or functional. I can code and debug faster in it than in any other language I’ve used. Two huge improvements would be 1) make it much easier to create web applications with it other than using frameworks like Django, and 2) have Apple, Google, and Microsoft make it a first class language for mobile app development.
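To illustrate a few of the things I just mentioned, here is a trivial snippet showing bignums, the built-in hash table (dict), and a functional-style list comprehension, all with no libraries:

# Exact big integers, a dict used as a word counter, and a list comprehension.
def factorial(n):
    return 1 if n == 0 else n * factorial(n - 1)

print(factorial(100))                          # an exact 158-digit integer

counts = {}
for word in "the quick brown fox jumps over the lazy dog".split():
    counts[word] = counts.get(word, 0) + 1
print(sorted(counts.items()))

squares = [n * n for n in range(10) if n % 2]  # squares of the odd digits
print(squares)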
Several years ago I wrote a blog entry listing what I thought were the albums to start with if you were building your Bob Dylan album library. Yesterday I updated that post with correct links and one extra album.
Today’s song is from the third album on the list, Blood On The Tracks. This is probably my favorite Dylan song, and it’s not just for the mention of mathematicians toward the end.
There’s a bootleg version of Blood On The Tracks you might be able to find that’s called Blood on the Tapes. It has early and alternative versions of the songs on the published album.
Next: Things Have Changed
This weekend I was thinking about a post in my old archived blog where I made suggestions about what Bob Dylan albums to start with if you were building your musical library of his work. In that post I restricted myself to one album per decade, with one exception. When I looked at that old blog entry, I discovered that almost all of the links were broken, mainly because the structure of bobdylan.com had changed dramatically. So I went through, and fixed all the links.
Since I went to all this trouble, I decided to repost the blog entry with one new addition, the mention of Biograph, which appears at the bottom below.
I’ve been mulling this over for a while. If you were to start a Bob Dylan musical library, how would you do it? There are over forty albums, plus all sorts of illegal bootlegs. Rather than just say “start with these twenty,” I wanted to deal with a more manageable number. After weeks of consideration, mostly while I was driving, this is what I came up with. It’s “one per decade plus another for the Sixties.”
There are many variations on this and some might complain that I’ve skipped important eras such as the late Seventies/early Eighties Christian era. All things in their time. Once I’ve established this initial list, we’ll branch out from here, looking at albums that are either close in time to these or else not otherwise represented.
This list does not include all the albums that I would consider masterpieces, but it’s a start. If you don’t have any of these, I would recommend you get them in the chronological order listed. Things to which you should pay attention: lyrics (always), simplicity or complexity of musical arrangement, quality of voice, and overall album consistency. I’ve included some free form comments after each album.
Thanks to my daughter Katie, the best Dylanologist I know, for her input on this, though any errors and opinions are mine alone.
The Freewheelin’ Bob Dylan (1963)
Second album, first masterpiece. All songs are now classic but particularly note “Blowin’ in the Wind,” “Masters of War,” “A Hard Rain’s A-Gonna Fall,” and “Don’t Think Twice, It’s All Right,” and that’s just half way through the album. That’s Suze Rotolo on the cover with Dylan in Greenwich Village in New York City.
Highway 61 Revisited (1965)
Yet another masterpiece, perhaps the best of them, and only two years after Freewheelin’. Rolling Stone magazine named the song “Like a Rolling Stone” its number one song of all time. This album features the electric blues guitar of Mike Bloomfield. Dylan has said that he regrets that he couldn’t get Bloomfield for later albums. He also jokingly suggested that “Desolation Row” should be the new national anthem.
The album is squarely in the middle of the whole acoustic-folk-to-electric-rock controversial period for Dylan that is documented in the video No Direction Home.
Blood on the Tracks (1975)
For many people, this was Dylan’s first “hit album” since Blonde on Blonde in 1966, but that’s skipping over a lot of excellent work that I’ll get to in future entries in this series. If you don’t own the album, you have probably at least heard the opener “Tangled Up in Blue” and “Shelter from the Storm,” as well as Joan Baez’s cover of “Simple Twist of Fate.”
The song “Idiot Wind” stands out as one of the best put-down songs (perhaps mutual), along with Dylan’s single “Positively 4th Street” that’s on Biograph, a collection from 1985. (4th Street is in New York City in Greenwich Village.)
Much of this album was recorded twice, once in New York City and then again in Minneapolis, with the final album primarily from the latter sessions. I’m told that the bootleg album Blood on the Tapes contains some of the earlier versions.
Oh Mercy (1989)
With this album we jump forward fifteen years and almost out of the Eighties. It was produced by Daniel Lanois and was, according to Dylan’s Chronicles, rather difficult at times to construct and record. The first song, “Political World,” really sets the tone and feel for the album and is unlike anything I had heard previously from Dylan.
You may have heard the song “Man in the Long Black Coat” on Joan Osborne’s album Relish. For another Lanois-produced album with a similar musical atmosphere, see Emmylou Harris’s album Wrecking Ball. She covers an earlier Dylan song, “Every Grain of Sand,” from the Dylan Gospel-era album Shot of Love.
Time Out of Mind (1997)
Many people think of Dylan’s career as being anchored in a series of album trilogies and, if this has any basis in fact, this might be the first album in the latest trilogy. (As with all such statements, this is denied by Dylan, according to Wikipedia.) It was the second album produced by Daniel Lanois and it won the Grammy for Album of the Year. It is strong from start to finish.
If “Dirt Road Blues” doesn’t get you moving, I don’t know what will.
Dylan is back in full poetic form on this album, as in “Tryin’ To Get To Heaven”:
People on the platforms
Waiting for the trains
I can hear their hearts a-beatin’
Like pendulums swinging on chains
When you think that you lost everything
You find out you can always lose a little more
I’m just going down the road feeling bad
Trying to get to heaven before they close the door
The blues live.
Modern Times (2006)
(Remember I first wrote this in 2007 …) As I write this entry, this album is the latest by Dylan and returns in full force to the quality of Time Out of Mind. Unlike that album but like the intermediate “Love and Theft”, Dylan self-produced this under the pseudonym “Jack Frost.” I can’t recall seeing one bad review of this album, and I concur.
The kickoff song, “Thunder on the Mountain,” is notable for many reasons, including his interesting rhyme for “orphanages,” which I suspect is a first. When Katie and I saw Dylan in Boston in late 2006, this was the first song in the encore.
I think the most poignant song here is “Workingman’s Blues #2”; it recalls, in my mind, some of Dylan’s early influences such as Woody Guthrie.
This last album was not in my original 2007 blog entry but I’ve added it for one reason: if you only own one Bob Dylan album, this should be it. Also, if you own all the others, get this too.
Biograph (1985)
This album contains 53 Dylan classics from 1962 through 1981, though perhaps not in the versions you’ve heard. Many are straight from the albums, but quite a few are from concerts, and the differences are interesting. Dylan fiddled endlessly with arrangements and sometimes lyrics, but none of these are jarring. When it was released on CD, it took up three discs. I suggest you listen to it all 3 times through, and it’s a great accompaniment for a long car trip.
In the last few months I have given several talks to students getting graduate degrees in fields that involve analytics. For many of these students, their first question is “How do I get a job?”. Once we move beyond that, I talk about where analytics is used in companies. An exhaustive list would be exhausting, so let me give you some ideas and how you should think about them.
Let’s first look at this in one dimension. On the left we have some of the academic or technological disciplines that make up the broad field of what we call analytics.
You could study these and through examples learn some of the applications. With this approach you might say “I am an Operations Research expert. What are the various fields in which I could work that use analytics?”.
On the right, I list some of those fields of application. If you start on the right, you might ask “I am a supply chain expert. What disciplines within analytics should I learn?”. Those are on the left, though you may not need all of them.
A better way of viewing this is in two dimensions.
Here you get a better idea that the disciplines can be used in different application areas and the application areas need various technical expertise. You might go on from here and weigh the intersection points to understand if, say, Machine Learning is more important for Pricing or Supply Chain.
Here’s my advice:
- Among the disciplines, decide which you love the most and for which you have the best aptitude. Go deep on those, but learn enough about the others so you know when a given solution will require them.
- If you are working on a team, seek out others who have skills that complement your own (this is good advice in general).
- If you are working in an application area, understand that broadly and know how the disciplines are used in each. Become expert in one or two of the disciplines but over the course of your career, learn more and more about the adjacent fields and pick up those skills.
- It is likely that if you start in one application area, you will be employed in another within 3 to 5 years.
- Shift jobs within your organization or between organizations to learn more disciplines and application areas. Beware becoming a mile wide and an inch deep: truly become an expert in some of the areas of technology and use.