Sunday, December 19, 2010

My First Earthquake

I've been studying faults and earthquakes for over a year now, but it was not until just now that I felt my first temblor. I'm currently visiting my fiancé's family in Los Altos, CA. We were all sitting around the breakfast table when all of a sudden the plates started rattling. It was very brief, only a couple of seconds long, but it was definitely an earthquake.

The USGS has it initially sized at a magnitude (local magnitude, ML) of 3.1.

Donate to charity simply by browsing the web

Google Chrome is now offering an extension that tracks the number of tabs you open in a day. Google will donate an as-yet-unspecified amount of money per tab viewed, up to 250 tabs a day per user, up to 1 million dollars total. The effort started Wednesday and stops today, and I only just learned of it. Grab it while you can and spend the day helping charities as you browse the web.

Thursday, December 2, 2010

CCOM/JHC seminars now offered as webinars!

The CCOM/JHC seminar series, which runs each fall and spring semester and pulls in some fantastic speakers, is now offered as a webinar, so anyone, anywhere can join in.

Each week, generally by Wednesday (and certainly by Thursday), you will see a headline on CCOM's homepage advertising the coming Friday's seminar, along with a link that allows you to register for the webinar. The webinars will play in almost any browser on a Mac or PC. For a full list of requirements, view here and click on "attending a webinar."

If you attended our very first attempt at this, Friday, Nov. 19th, you know we had some "technical difficulties." Our speaker was using Keynote on her Mac, and we did not realize that by default Keynote is set to not allow screen sharing. As a result, our webinar had audio, but not video. We now have all the kinks worked out, and tomorrow's webinar should hopefully go off without a hitch: video, audio, and all. You'll even be able to text in questions!

Tomorrow's seminar: Multi-Source Geospatial Information Integration and Analysis for Coastal Management and Decision Making by Ron Li, Director of the Mapping and GIS Laboratory at Ohio State University. Just click the title to register.

BTW, if you want to know how to make Keynote and GoToWebinar play nicely (in case you're ever giving a webinar), check out the super quick solution here; you just have to change one preference.

Tuesday, July 20, 2010

Digital Mapping Camp for Kids

This is very, very cool! I hope when I have kids, I'll be able to bring them to conferences as well, and while the adults sit in talks, the kids can learn how to put together bathymetric images of the seafloor!

For Kids Of Cartographers, Digital Mapping Class Is In : NPR

Wednesday, July 14, 2010

Free Amazon Prime for Students

Amazon Prime is normally 79 bucks a year. For that, you get free two-day shipping on any eligible item with no minimum purchase required. You can also upgrade to 1-day shipping for only 4 bucks an item.

This sounds great, but not enough that I am willing to fork over the 80 bucks to get it. Well, now I don't have to. Amazon is giving away free 1-year memberships to Amazon Prime to all college students currently enrolled in at least one course (this means waiting until fall semester for me...). In the fine print, it also mentions that this offer may be extended beyond the initial 1-year period. In addition to the free shipping, Amazon will also email out exclusive student-related discounts and promotions, though you can opt out of these emails at any time.

The idea of all this, of course, is to get more students buying things like textbooks from Amazon. Given that Amazon pretty much ALWAYS has the cheapest prices for new textbooks - I'm anal and prefer new to used - I almost always order through them anyway.

Check out this free Amazon Prime offer yourself: Amazon Student.

Tuesday, July 6, 2010

Holy Guacamole: ArcGIS for iPhone/iPad... Free!

The title pretty much says it all!

You can view available maps, including ones that you upload yourself. This means you can access your own GIS data. They also provide tools to digitize and measure your own routes, right on the iPhone or iPad!

Here's a link to a blog post with some more info:

Here is the iTunes Store link:
iTunes Store

Tuesday, June 15, 2010

Italian earthquake scientists indicted for failing to predict quake

Excerpt from an email I received from the Seismological Society of America (SSA):

Two weeks ago the L'Aquila Prosecutor's office indicted for manslaughter the members of the National High Risk Committee that met in L'Aquila one week before the Mw 6.3 earthquake. The charges are for failing to provide a short-term alarm to the population before the earthquake struck, killing more than 300 people.
The president of INGV (National Institute of Geophysics and Volcanology), Enzo Boschi (member of the High Risk Committee), and the director of the National Earthquake Center, Giulio Selvaggi (just accompanying Boschi to the meeting as technical specialist), are among the scientists in seismology and earthquake engineering now under investigation together with some civil protection officials.
This is insane. Earthquake prediction is notoriously difficult, and there is no currently accepted method for predicting an event. The best scientists can do now is develop seismic hazard maps and risk assessments in order to help guide better building codes, train response teams, and help prepare the community as much as possible. The SSA has drafted an open letter addressed to the President of the Italian Republic, and asks anyone working in seismology or the Earth sciences in general to sign the letter and show support for these scientists.

The letter can be found here: open letter in support of Italian scientists

M5.7 Quake shakes up my folks

Ahh... nothing like waking up in the morning and finding a Facebook post from your dad mentioning they just got rocked by a M5.7 earthquake. At least I know they are alright if they can post a status update about it. That's a pretty decent-sized quake, and they said they felt the rolling for about 10 - 15 seconds. Luckily they are far enough away from the epicenter, near Mexicali, Mexico, that most of the higher-frequency energy was already absorbed and they just felt the lower-energy stuff.

Here's a link to the USGS webpage for the quake: USGS event map. If you live near the area and felt the quake, be sure to go to the USGS link provided and fill out their shaking intensity form. Scientists use this data to determine the shaking intensity of the event and it helps provide data in areas that may not be covered by instrumentation.

Monday, June 14, 2010

Track Oil Spill Response Vessels in Real-Time

Want to see what all the response vessels are doing in the Gulf of Mexico?

Go to the government's Environmental Response Management Application (ERMA) website. This site leads you to a Web-based interactive GIS platform where you can turn on various layers such as "Response Vessels Snapshot" and the National Marine Fisheries Service (NMFS) fisheries closures. There is currently a typo in the Response Vessels Snapshot layer, where it says the snapshot is from June 10. Don't let this throw you off; these snapshots are real-time, with the exception of an approx. 10-minute delay while data is uploaded to the server, processed, and loaded into ERMA.

Green triangles represent research vessels, including NOAA ships, and blue represent other response vessels. You can click on the information icon (the "i"), and then click on individual vessels for more information.

Wednesday, June 9, 2010

A Blogger Worth Following

While Googling how to set a certain preference in Emacs, I came across Sacha Chua's website. Sacha is an enterprise consultant for IBM, and she helps people learn how to use wikis, blogs, etc., and be more productive as a result. She blogs about her work, geeky things such as Org-mode and Emacs, and life in general. I really dig her site and what she is about, and I recommend that folks check her out.

I wonder if Blogger has a way for me to organize my posts in clickable category links... I have the labels widget to the side that uses keywords to organize your posts, but it would be nice to have major categories as their own clickable links like Sacha does, where then each category has its own set of labels.

Time Machine not Working after Logic Board Replacement

I just got my Macbook back from Apple this morning with a new logic board and new screen. Hooray!

I connected my Time Machine drive and immediately noticed a problem. Time Machine could not see any of my previous backups. When I browse the drive using Finder, I see them, but not with Time Machine. Turns out that Time Machine is closely coupled with your MAC address in 10.5.8 (not so in 10.6, Apple told me, so you 10.6-ers will not face this issue). Anyway, new logic board = new MAC address = unhappy Time Machine.

There is supposedly a fix that involves resetting the attributes of a plist file and renaming a hidden file in Time Machine. This method is detailed on the Mac OS X Hints website. The Apple tech told me that Apple engineers say this fix should work, but he has never had success personally. One little typo and it won't work.

I followed the directions to a "T," and it did not work for me. I was very careful about the spaces in my Time Machine backup disk name, but something still went wrong. Anyway, the old backups are still there; I just still cannot see them with Time Machine. Although now when I follow the Mac OS X Hints tips for finding the MAC address associated with Time Machine, it shows the correct one.

Apple's suggestion, assuming you are not missing any major files on your system: Abandon your old backups and start fresh with Time Machine. Either that or update to 10.6 (which for me would involve purchasing a new version of Matlab, so no thank you).

Looks like I'll be starting over with Time Machine tonight, *sigh*.

Monday, June 7, 2010

Another reason to use a version control system

I have posted about the usefulness of the Subversion version control system (svn) a couple of times already, but this morning prompted me to do another post. Subversion allows you to back up all your valuable data, saving each commit as a separate version. You can see the differences between any two versions, check out earlier versions, have multiple working copies, etc. If you are a coder, a version control system like svn is essential.

Anyway, this morning made me love svn even more. My main computer is being shipped off for repairs, and I am currently using the boyfriend's old Mac PowerBook G4. I needed to get it set up with all my files so I can continue to do my work. Instead of having to transfer files over one at a time from one computer to the other, I simply did an svn "checkout" of what I needed.

A simple "svn co https://my-repository-path/trunk/Code" and "svn co https://my-repository-path/trunk/References" at the terminal prompt (from within the directory I want to download them to) and I am good to go. All my latest scripts and my complete reference directory (all the PDFs of all the journal articles I use as references) are now on this computer within seconds. If I make any changes to these files while on the boyfriend's computer, I can simply "commit" them to my svn repository, and when I get my computer back, do an "update" and I am all set!
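For checkouts I do often, the same thing can even be scripted. Here is a small sketch using Python's subprocess module; the repository URL is a made-up placeholder, not my real server:

```python
import subprocess

# Made-up placeholder URL; substitute your own server and trunk path.
REPO = "https://svn.example.com/repo/trunk"

def svn_cmd(subcommand, *args):
    """Build the argument list for an svn subcommand."""
    return ["svn", subcommand, *args]

def checkout(subdir):
    """Check out one trunk subdirectory into a same-named local directory."""
    subprocess.run(svn_cmd("checkout", f"{REPO}/{subdir}", subdir), check=True)

# On the borrowed machine, grab just what is needed:
#   checkout("Code")
#   checkout("References")
# After editing, "svn commit" the changes; back on the repaired machine,
# an "svn update" brings everything in sync.
```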

JabRef has also proven to me, once again, its awesomeness this morning. I simply open up my main library.bib BibTeX file (just checked out from svn) within JabRef, and there are all my journal articles, instantly organized and searchable. The PDF links work automatically: simply click on the icon, and the journal article pops open before me in Acrobat Reader. To see why JabRef is such an awesome reference database software, check out my initial blog post on it here, and the follow-up (showing how, in conjunction with Zotero, it can make organizing and collecting references a breeze) here.

Friday, June 4, 2010

The Topsy Turvy Experiment

One Topsy Turvy, One Tomato Plant, One Summer....

I am not sure what it is about the Topsy Turvy that has me so fascinated, but for some reason I could not resist buying one when I saw them on the shelf in Rite Aid. Perhaps it is the pleasure of being able to grow fresh tomatoes without having to deal with all the slugs our garden attracted last year, or the idea that this plant may escape the blight that claimed the lives of so many of our brave, young tomatoes last year. Whatever it is, I gave in to temptation. I also decided that this would make a fun topic for a photo log. Throughout the summer I'll occasionally post about the progress of this little tomato plant in the Topsy Turvy. If you've thought of buying one yourself, you may want to stay tuned...

(Topsy Turvy and Purple Cherokee Heirloom Tomato plant)

(Topsy Turvy hanging off our side porch)

Thursday, June 3, 2010

ERMA tracks the oil spill through web-based GIS

Some of my fellow CCOM-ers have been helping out with the oil spill remediation attempts, and now that some of it has been made public, I can finally share!

ERMA, which stands for Environmental Response Management Application, is now up and running in a web-based GIS portal that allows users to turn different layers on and off. You can view where the spill is now, predicted trajectories, current weather conditions, etc. While not everything in ERMA is viewable yet by the public, what is available is pretty cool and starts to give you an idea of all the things that go into an emergency disaster response such as this one.

Tuesday, May 25, 2010

Free SQLite Manager for Mac OS

Sqlite-manager is a tool that originated as a Mozilla (Thunderbird, Lightning, Firefox) plug-in for managing SQLite databases. However, over at the Kiveo website blog, I found instructions for making this plug-in a stand-alone application.

I followed the directions exactly as posted (note, though, that the current version of the sqlitemanager-xr package is now 0.5.16) and it works perfectly.

Here's a running list of the current features:

    • dialogs for creation, deletion of tables, indexes, views and triggers
    • ability to rename, copy, reindex tables
    • ability to add and drop columns
    • create new db, open any existing db, copy an existing db
    • supports writing your own queries (single or multiple)
    • supports saving the queries with a name
    • a tab for database settings (no need to write the pragma statements) where you can view and change the sqlite library settings
    • export tables/views as csv, sql or xml files
    • import tables from csv, sql or xml files
    • a dropdown menu showing all profile db (.sqlite)
    • an intuitive hierarchical tree showing all tables, indexes, views and triggers
    • ability to see the master tables
    • ability to see the temporary tables, indexes, views and triggers
    • ability to browse data from any table/view
    • dialogs to allow searching in a table/view
    • allows editing and deleting selected record while browsing a table's data
    • allows adding, saving and changing blob data
    • an extensive menu that helps with writing sql by hand and then executing it
    • remembers the last used db, table and the tab (structure, browse & search, etc.) across sessions
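For quick scripted checks on the same kinds of databases, Python's built-in sqlite3 module covers many of the basics without a GUI at all. A minimal sketch; the table and values here are made up for illustration:

```python
import sqlite3

# ":memory:" gives a throwaway database; use a file path for a real one.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Create a table, insert a row with a parameterized query, and read it back.
cur.execute("CREATE TABLE stations (name TEXT, depth_m REAL)")
cur.execute("INSERT INTO stations VALUES (?, ?)", ("wave-tank", 2.5))
conn.commit()

cur.execute("SELECT name, depth_m FROM stations")
rows = cur.fetchall()
print(rows)  # -> [('wave-tank', 2.5)]
conn.close()
```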

And here's a screenshot of it running happily on my Mac (OS 10.5.8)

Thursday, May 20, 2010

Radar for accurate draft measurements

Here at CCOM, we have been testing a small radar for use as an accurate tide gauge out in the field. The WaterLog Series radar has an accuracy of +/- 3 mm and a typical report time of 800 milliseconds, which is pretty snazzy.

(Radar mounted above CCOM's wave tank)

I was thinking it would be pretty sweet if we could mount one of these radars on the sonar mount of the R/V Coastal Surveyor, CCOM's main survey ship. Right now we have to take draft measurements by holding a yardstick up to a clear tube and measuring the distance between the meniscus of the water in the tube and the top of the Inertial Motion Unit (IMU). This leaves A LOT of room for human error.

(R/V Coastal Surveyor, with the hydraulic ram out of the water)

On the Surveyor, sonars are mounted to the bottom of a hydraulic ram that moves up and down. Every morning prior to surveying, surveyors measure the height of the ram above a specific point on the deck of the ship. This is then used to calculate the depth of the sonar below the IMU, since you know the total length of the ram and the height of the IMU in relation to the point on the deck that you measure the ram height to. This means that each day we would also know the height of the radar in reference to the sonar. We could simply take a running average of, say, a minute of data and get a much more accurate draft than we could ever measure with a yardstick.

Furthermore, we could get continuous draft measurements throughout the day, which is something you cannot readily do now, barring someone taking several yardstick measurements in a row and hoping that the average somehow actually represents the real draft. Plus, trying to measure a meniscus in a tube with a yardstick, on a small moving vessel that is rocking in waves, is not something I particularly want to do.
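The geometry described above is simple bookkeeping, sketched below in Python. All of the offsets and numbers are hypothetical placeholders, not real Coastal Surveyor values:

```python
# Toy sketch of the draft bookkeeping described above.
# All offsets are hypothetical placeholders, not real R/V Coastal Surveyor values.

RAM_LENGTH_M = 3.50           # total length of the hydraulic ram
IMU_ABOVE_DECK_MARK_M = 0.80  # IMU height above the deck reference point

def sonar_depth_below_imu(ram_above_deck_m):
    """Depth of the sonar (at the bottom of the ram) below the IMU, given
    the morning measurement of the ram top above the deck reference point."""
    return RAM_LENGTH_M - ram_above_deck_m + IMU_ABOVE_DECK_MARK_M

def averaged_draft(radar_ranges_m, radar_above_sonar_m):
    """Draft of the sonar below the waterline: the radar's fixed height above
    the sonar minus its averaged measured range to the water surface.
    Averaging a minute of ranges smooths out wave motion that a single
    yardstick reading cannot."""
    mean_range = sum(radar_ranges_m) / len(radar_ranges_m)
    return radar_above_sonar_m - mean_range
```

For example, a minute of radar ranges averaging 2.50 m, with the radar fixed 4.00 m above the sonar, gives a 1.50 m draft.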

Wednesday, May 19, 2010

Typo perhaps not Google's after all...

So yesterday I posted about how I had found a typo in the name of a fracture zone in Google Earth's Ocean layer.

The Vernadsky fracture zone (~ 7º10'43" N and 34º35'05"W) is spelled "Vernadskiy" in Google Earth.

The IHO-IOC GEBCO Gazetteer of Undersea Feature Names (2009) lists the spelling as Vernadsky, and I took this as gospel, since this document tends to be the go-to document to check the name of any feature on the seafloor.

The feature is named after the very famous Russian geochemist Vladimir I. Vernadsky, considered one of the founders of geochemistry. I did a search and actually found several references to him using both spellings, including this one:

Vernadskiy Tavricheskiy National University in the Ukraine. Now I would imagine that if they are going to name a university after someone, they make sure to spell the name right.

So perhaps the misspelling is in the actual Gazetteer and not Google. This guy is pretty famous, so you would think this would be easier to sort out.

I suppose it could be due to the westernized spelling of an eastern name, in which case, the eastern spelling (Vernadskiy) should be the correct one.


Tuesday, May 18, 2010

Google Earth Typo

I was flying through Google Earth earlier looking at some oceanic transform faults when I came across something that gave me pause: a typo. I never really expected to see a typo in Google Earth. I posted it to their user help forum, but I am not sure if the right people will see it.

The misspelled feature is a fracture zone in the Google Earth ocean layer. The Vernadsky Fracture Zone (~ 7º10'43" N and 34º35'05"W, named after the very famous Russian geochemist Vladimir I. Vernadsky) is spelled "Vernadskiy" in Google Earth.

I checked the IHO-IOC GEBCO Gazetteer of Undersea Feature Names (2009) just to be sure, and it lists the correct spelling.

Here's a picture of the offending fracture zone:

Make Your Facebook More Secure

A friend pointed me to a great little tool that helps you lock down your Facebook and make your personal information more secure: a scanner that will check your privacy settings and show you all your vulnerabilities. Best of all, it works on any browser and on any system. Simply drag the tool to your toolbar to make a bookmark (or bookmark it using the menu options in your browser), navigate over to Facebook, and then click on the "Scan for Privacy" bookmark link. Below is an example of the result:

Wednesday, May 5, 2010

Google Maps of the Oil Spill

Google is tracking the oil spill in the Gulf of Mexico. Check out this weblink to their map, as well as some KML files for Google Earth.

Tuesday, April 13, 2010

Two Must-Have Mac Tools to Manage Hard Disk Space and Time Machine

So earlier today I had a bit of a freak-out! Inexplicably, the remaining 25 GB or so of free disk space on my Mac disappeared. I got a little pop-up warning telling me my disk was full, and sure enough, when I checked I only had about 150 MB left. I immediately assumed the worst: that my drive was corrupted, my computer was going to die, all my work would be lost, I would subsequently fail out of my PhD, and I would end up working at McDonald's. OK, this might be a stretch, and in any case, I have yesterday's Time Machine backup, but I was still freaked out. How do you suddenly lose 25 GB of free space? I noticed it after plugging in my Time Machine disk and letting a backup start. I checked my drive but saw nothing out of the ordinary. On my boyfriend's advice, I let Time Machine finish the backup. This way I could hopefully do a diff between two backups and find out what was eating up the hard disk space.

During the backup, I deleted some small files to help make space and emptied my trash. Then suddenly, voila! My free space came back. My trash had been emptied before, so that was not the cause. Perhaps my Entourage address book had become corrupted? This was one of the files I deleted to help create space, since I do not use Entourage at all. My boyfriend thinks perhaps Time Machine itself created some weird temp file that got out of control and that it disappeared as Time Machine finished doing its thing. I do not know what happened, but in the process my boyfriend and I found two great FREE tools that we have now decided everyone should have.

1) Disk Inventory X

This awesome program will show all attached drives and the space used and available, and gives you a really nifty graphical interface so you can immediately see which files are space-hoggers. It takes a little time to get going if you decide to view your entire computer, but trust me, it is worth it!

2) TimeTracker

This utility will list all your Time Machine backups, the space they are using, and a complete breakdown of what has changed since the previous backup.

Saturday, April 10, 2010

New CCOM-er, New Blogger

Jonathan Beaudoin, from the University of New Brunswick, just recently joined us at CCOM. His work on the uncertainty of 4-dimensional sound speed in the water column is pretty fascinating!

Jonathan is also a blogger and Mac user, so I thought I would give his blog a shout-out. You can find it here: Jonathan's Blog.

I must admit though, I am not sure how I feel about him now that I have seen his knock on Emacs. I use Aquamacs on my Mac and I love it!

Thursday, April 1, 2010

Can Toads Be Used to Predict Earthquakes?

There is a long history of animals acting strangely prior to an earthquake. The Japanese believe that the catfish may be used as a predictor of seismic activity. Dogs, cats, chickens, and horses are also believed by some to be able to detect p-wave energy well before humans can. Now it appears that toads may be added to that list:

Can Toads Be Used to Predict Earthquakes?

Thursday, March 11, 2010

Want to go to the Arctic? Just donate 25K to the La Jolla Symphony

Check out this article:

Dr. James Swift, from the Scripps Institution of Oceanography, is offering up the chance to go on the Coast Guard icebreaker, the Healy, for a research cruise up in the Arctic. All you have to do is donate 25,000 smackaroos or more to the La Jolla Symphony, and you get entered into the drawing.

Sure glad I study the ocean; I got to go for free :)

Tuesday, March 9, 2010

R.I.P. ABE

This past Friday, March 5, 2010, the AUV ABE was lost during a dive off the coast of Chile. Here is the formal WHOI obituary, I mean announcement, and here is a nice excerpt highlighting some of ABE's many accomplishments during his life:

Built as a prototype, ABE quickly became a workhorse. It was the first autonomous robot to make detailed maps of mid-ocean ridges, the 40,000-mile undersea volcanic mountain chain at the boundaries of Earth’s tectonic plates where new seafloor crust is created. It was also the first AUV to locate hydrothermal vents, where hot chemical-rich fluids spew from the seafloor and sustain lush communities of deep-sea life. ABE explored seamounts, undersea volcanoes, and other areas with harsh, rugged terrain. In addition to researchers and students from the United States, ABE advanced research for scientists and engineers from Canada, the United Kingdom, Germany, Australia, New Zealand, Japan, China, Italy, Ecuador, and most recently Chile.

Friday, March 5, 2010

Chilean Earthquake Shortens Earth Days

Scientists now estimate that our Earth days are ~1.26 millionths of a second shorter. Here is a nifty blog post explaining why.

Wednesday, February 17, 2010

Look Ma, I Created an Anaglyph!

I just stumbled across a pretty sweet Matlab-framework tool called Mirone that allows you to do all sorts of analyses on various grid/image formats.

If you do not have Matlab, there is a standalone version that runs on Windows. For Mac folks, it will run under Snow Leopard if you have Matlab 2009, but sadly it does not seem to work in Matlab 2007 on 10.5.

I am currently running it on my Windows desktop at school using Matlab 2009b and I am pretty impressed. It has a bunch of image processing/geophysical tools at its disposal. In just a few minutes of playing, I was able to open a GMT grid and plot plate boundaries and earthquake epicenters on it.

What mainly got me excited about it was that I was also able to generate a pretty sweet anaglyph. If you have a pair of red-blue glasses, give it a looksie! The area is the Carlsberg Ridge in the Indian Ocean. I recommend double-clicking the image to open it in full-size.
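The trick behind a terrain anaglyph is simple: render the scene twice with a small horizontal parallax proportional to elevation, then pack the left-eye view into the red channel and the right-eye view into green and blue. Here is a minimal pure-Python sketch of that idea (my own toy version, not Mirone's actual algorithm, with pixel brightness standing in for proper hillshading):

```python
def anaglyph_from_dem(dem, max_shift=4):
    """Red-cyan anaglyph (rows of (R, G, B) tuples) from a 2-D elevation
    grid given as a list of rows. Each pixel gets a horizontal parallax
    proportional to its normalized elevation."""
    lo = min(min(row) for row in dem)
    hi = max(max(row) for row in dem)
    span = (hi - lo) or 1.0
    w = len(dem[0])
    out = []
    for row in dem:
        z = [(v - lo) / span for v in row]           # normalize elevations to 0..1
        gray = [int(255 * v) for v in z]             # brightness stands in for shading
        shift = [round(max_shift * v) for v in z]    # per-pixel parallax in pixels
        left = [gray[min(w - 1, max(0, x + shift[x]))] for x in range(w)]
        right = [gray[min(w - 1, max(0, x - shift[x]))] for x in range(w)]
        out.append([(l, r, r) for l, r in zip(left, right)])  # red = left, cyan = right
    return out
```

Viewed through red-blue glasses, the high pixels appear to pop toward you because the two eyes see them displaced in opposite directions.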

Friday, February 12, 2010

Maybe I should work at Google...

I love my job. Well, more precisely, I love being a graduate student in oceanography. I get to do amazingly cool research, and I get to go to amazingly cool places as well (even the Arctic! - no pun intended). However, every now and then I come across something that causes me to daydream about what it would be like to be in a different field. Today's daydream came courtesy of Google and their Google Lat/Long blog.

Recently they rigged up a Google "Street View" snowmobile to give us a true 3D look at the slopes for the upcoming Winter Olympics. People actually got paid to snowmobile around the slopes for a day!!!

Street View Hits the Slopes at Whistler

Evidence confirms mud volcano was man-made

Mud Volcano Was Man-Made, New Evidence Confirms

This article discusses the findings of Richard Davies, director of the Durham Energy Institute and co-author of a new paper in the journal Marine and Petroleum Geology. The result of his analysis is that the Lusi mud volcano in Sidoarjo, Indonesia was indeed caused by the drilling company when they pulled their drill out from an unstable hole. Since its first eruption on May 29, 2006, Lusi has dumped out 100,000 tons of mud a day. It now covers almost 3 square miles to a depth of 65 feet and has displaced thirty thousand people. The drilling company maintains that an earthquake that occurred 175 miles away on May 27, 2006 is the real cause of the mud volcano, although there is plenty of evidence to the contrary.

One of the things I find interesting about this story is that both scientists working for the drill company and independent scientists have papers about this very issue in the journal Marine and Petroleum Geology. The scientists working for the company all find that the real cause of the volcano was the earthquake, which in turn means that their company should not be held financially responsible. The independent scientists all find flaws with that reasoning and claim that the earthquake could not have been the cause. Clearly, the drill company scientists have every reason to be biased. If their results indicated the drill company was at all responsible, not only would they probably be out of a job, but they would be setting the company up for a huge financial burden. So how can we as readers, or the journal for that matter, look at this article objectively? It is hard to see it as just another scientific exploration and discussion of a dataset. Furthermore, I am curious if this article mentions that the authors work for the company in question. Surely that disclosure should be made somewhere where the reader can easily find it, shouldn't it?

                                        Tuesday, January 26, 2010

                                        Geodesic Distance in ArcMap

                                        One of the things I like about ESRI ArcMap is the ease with which you can measure features in your data. Recently I wanted a a simple way to measure geodesic distances of some projected data. The measure tool in ArcMap, by default, measures distances in projected units in projected data, and geodesic distances only for data displayed in geographic lat/long. If you hold down the shift key while measuring a distance, Arc will calculate geodesic distances regardless of projection. What I really wanted to be able to do, however, was to be able to draw lines in a shapefile, and then have Arc calculate the geodesic distance of those lines. This is something that Arc cannot do, without you first unprojecting (or more correctly, reprojecting) your data to geographic coordinates.

                                        Luckily, I found this:

Looks like some of the folks at ESRI got together with the USGS and designed a plug-in that will let you, among other things, calculate geodesic distances from projected data. It uses the parameters specified by your projection to unproject the data into geographic coordinates and then calculates the geodesic distance on the fly. Since it uses the semi-major and semi-minor radii that you specify in your projection, the distortion should be minimized. This tool is a definite time saver, as it reprojects only the data you are calculating and lets you keep your whole project in projected space.
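If you just want a quick sanity check outside of Arc, the core idea is easy to sketch in Python. Below is a minimal great-circle (spherical) approximation using the haversine formula; the function name and Earth radius are my own choices, and a true geodesic calculation on the ellipsoid (what Arc's tool does) will differ slightly:

```python
from math import radians, sin, cos, asin, sqrt

def great_circle_m(lon1, lat1, lon2, lat2, radius_m=6371000.0):
    """Haversine great-circle distance in meters on a spherical Earth.

    A spherical approximation only -- an ellipsoidal geodesic (what Arc
    computes) will differ by up to roughly half a percent.
    """
    lon1, lat1, lon2, lat2 = map(radians, (lon1, lat1, lon2, lat2))
    a = (sin((lat2 - lat1) / 2.0) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2.0) ** 2)
    return 2.0 * radius_m * asin(sqrt(a))

# One degree of longitude along the equator is roughly 111 km:
d = great_circle_m(0.0, 0.0, 1.0, 0.0)
```

Handy for eyeballing whether a projected measurement is in the right ballpark, even if you would still want the ellipsoidal answer for real work.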

                                        Thursday, January 21, 2010

                                        ArcMap Projection double-check

I recently posted about how I had written a Python script to convert Smith/Sandwell topography data to an ARC ASCII grid. I brought the resulting data into Arc and everything appeared to be correct. It all seemed to be going quite swimmingly until, that is, I noticed that the Prime Meridian appeared to pass just off the eastern coast of Australia. Hmmm... last I checked, the Prime Meridian was still in Greenwich, UK. Clearly something was wrong. In my grid image, Greenwich fell on the very edge of the map. It seems that while I can bring the data into Arc and tell it what the projection is, Arc has a hard time displaying correct lat/long coordinates if your longitude ranges from 0 -> 360 rather than -180 -> 180, and there does not seem to be a way to tell Arc which of the two conventions your data uses. I find this strange and wonder if I am simply missing it. Anyway, since I could not figure out how to make it right in Arc, I went back to my original Python code.
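For what it's worth, converting a longitude between the two conventions is a one-liner once you are working in your own code. A small sketch (the function name is mine):

```python
def wrap_lon(lon):
    """Convert a longitude from the 0 -> 360 convention to -180 -> 180."""
    return ((lon + 180.0) % 360.0) - 180.0

# e.g. a point just east of Australia, at 200 E in 0 -> 360 terms:
wrap_lon(200.0)   # -> -160.0
wrap_lon(45.0)    # -> 45.0 (already in range, unchanged)
```

The modulo form also quietly handles values that have wandered outside 0 -> 360 entirely.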

I decided to make my grid actually go from -180 -> 180 by amending my code. I basically grabbed the western half of my grid and appended it to the beginning of my file, then cut off the redundant data. Using struct.unpack to decode binary data results in a tuple, which cannot be modified, so the first step is to convert it to a list:
#now unpack img data and write out the ASCII file data
for j in range(nrow):
    if v:
        if j % 500 == 0:
            print 'row:', j  #print row # every 500 rows
    raw_data = img_file.read(2*ncol)  #img_file: the open .img file; 'h' values are 2 bytes each
    row_tuple = struct.unpack('>'+str(ncol)+'h', raw_data)
    row = list(row_tuple)  #tuples are immutable, so convert to a list
    #now move western half of data to the east:
    #reverse, extend with the (reversed) half, reverse back, snip
    east = row[ncol/2:ncol]
    east.reverse()
    row.reverse()
    row.extend(east)
    row.reverse()
    corrected_row = row[0:ncol]

I had to reverse the rows before I could append the data, because the extend command places the appended data only at the end of the row. Once the data is correctly added, I simply reverse it back and snip off the extra. I could also just as easily have grabbed the first half of the row instead (east = row[0:ncol/2]) and avoided the reversals, but this just happened to be how I worked through it first.
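To make the reverse/extend/reverse dance concrete, here is a tiny self-contained sketch with toy data (the variable names are mine), showing that it ends up equivalent to simply swapping the two halves:

```python
# toy stand-in for one grid row: first half, then the half to be moved
row = [1, 2, 3, 4, 5, 6]
ncol = len(row)
half = row[ncol // 2:]        # the half that should end up in front

row.reverse()                 # extend() only appends at the end...
row.extend(reversed(half))    # ...so build everything up reversed
row.reverse()                 # flip back: the moved half is now in front
corrected = row[:ncol]        # snip off the redundant tail

# corrected is now [4, 5, 6, 1, 2, 3]
```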

                                        UPDATE: I showed Kurt my code last night and pointed out how I went about switching up the columns in my row variable. He said it was good that I figured out the two long ways, and then he showed me the one line method. Sure, you cannot really modify tuples after they have been created (e.g. no tuple.append or tuple.extend) but you can grab sections of them and switch them around like so:
                                        In [1]: t = (1.2,3.4,5.6,7.8,9.1,10.2)
                                        In [2]: t = t[3:] + t[:3]
In [3]: t
Out[3]: (7.8, 9.1, 10.2, 1.2, 3.4, 5.6)

                                        Applying this method to my code, I can switch up the western and eastern halves of the line I read in from the img file simply by:
                                        #now move western half of data to the east
                                        row = row[ncol/2:] + row[:ncol/2]

                                        Wednesday, January 20, 2010

                                        Pylint: The Python Code bug/quality checker

                                        Now that I am off and running in Python, I thought I should install Pylint. Pylint will review your code for bugs and quality (as determined by some formalized Python standards).

You can run it from the command line simply by typing (substituting your own script's filename):

pylint myscript.py
There are also a lot of options for silencing reports, getting more detail from messages, etc. If you leave the reports on, you get a global evaluation score. I just ran my script that decodes a binary SIO IMG file into an ARC ASCII grid through pylint and got this result:

                                        Global evaluation
                                        Your code has been rated at 4.92/10

Not bad, considering this is my first-ever script. My code runs fine; the majority of the messages involved formatting issues, such as some of my lines being too long. For code readability, the convention is to keep lines as short as possible, and I definitely have some that I could break into two lines. I also got some messages about not following my commas with a space (picky, picky).

                                        After shortening just a couple lines and putting in some spaces after some commas, I ran it again:

                                        Global evaluation
                                        Your code has been rated at 6.44/10 (previous run: 4.92/10)

                                        It is still complaining about some line lengths and the fact that I have some variables named with single letters, but in general my code is pretty good. If you want to help make your code better, especially if you are just learning like I am or if you plan to distribute your code, I recommend Pylint.

                                        Leaving the shallows...

                                        So I recently have experienced a huge change in my research direction, and for those possibly going through the same thing, I thought I would share the hows and whys.

Last year, my original research project began taking a serious nose-dive. I knew what I wanted to do and thought I had a good plan of attack, but I simply could not account for some variables. Amongst other issues, there was some crucial information I was going to need to make it work, and I was not going to be able to get it. Lesson learned: beware of projects involving potentially proprietary information. However, if someone wants to give me hundreds of thousands of dollars so I can buy my own plane and lidar system and hire a pilot, I may still be able to do it...

Anyway, after a couple months of feeling like I was drowning while trying to find another project, I was getting scared. I was nearing the start of my third year and I had no project. Sure, I had some 45 credits' worth of classes under my belt, but my research had basically gone up in smoke. I got to the point where I was considering taking a leave from school to give myself time to decide what I wanted to do and whether I even wanted to continue in grad school. The best advice I got was to talk to some of the faculty outside my department, particularly young, fresh-faced faculty who were still full of excitement. I did so, and it saved me.

Now, I have finally come across a project that I think is a perfect fit. It brings me back to my undergrad days of awe, when I was first learning about oceanography and geology and everything amazed me. I am leaving the shallows and heading into the deep! My new research focuses (in a nutshell) on using high-resolution multibeam sonar data to determine fault structure and earthquake potential along oceanic transform faults. I am so excited. My committee is also very excited and very supportive. This is key, as last year this was not always the case.

I am sharing this because I think it is important for other grad students (especially the PhD-ers out there) to know that sometimes setbacks happen, and sometimes they are for the best. This new project definitely has me more excited than my old one did. There is much more room for collaboration, and I am going to get to work with some great folks down at Woods Hole! My committee changed slightly as a result of the new direction, and is more excited and supportive than ever before. Did I lose some time? Probably about six months of research time. Does it matter? No. In the long run, who cares? As one of my committee members said to me, "the setback itself matters not, what does matter is whether or not you bounce back." Setbacks in grad school are common. They happen to the best of us. After all, we are still learning how to ask the questions that lead to great projects in the first place.

                                        So 2010 certainly has been a fresh start for me. A new year, a new project, and now, a new blog layout.

                                        Sunday, January 17, 2010

                                        Converting SIO img file format to Arc ASCII grid w/ Python

Recently I have been battling with getting an SIO binary file of predicted bathymetry (from the Smith/Sandwell satellite topography data) into an Arc-friendly ASCII grid format. GMT has a nifty img2grd command which generates a NetCDF from the img file; however, it takes some manual tweaking afterwards to get the coordinate bounds of the data to be Arc-happy (even if one follows img2grd with grd2xyz and its -E option). I wanted a one-stop solution where I could just feed in the img file and spit out an ARC ASCII grid, and that meant writing my own script. Therefore, I sat myself down this afternoon and began to teach myself Python (with some input from Kurt). Now, after just a couple hours of Googling, tweaking, and testing, I have a working Python script that does just what I need: it reads in an img file and spits out a space-delimited ASCII grid file, complete with the ARC header.
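For anyone curious about the ARC header itself, here is a rough sketch of the writing step. The six header keywords are the standard Arc ASCII ones; the function name and the tiny 2x3 grid are made up for illustration:

```python
def write_arc_ascii(path, rows, xll, yll, cellsize, nodata=-32768):
    """Write a list of row lists as a space-delimited Arc ASCII grid."""
    with open(path, 'w') as f:
        # the six standard Arc ASCII header lines
        f.write('ncols %d\n' % len(rows[0]))
        f.write('nrows %d\n' % len(rows))
        f.write('xllcorner %f\n' % xll)
        f.write('yllcorner %f\n' % yll)
        f.write('cellsize %f\n' % cellsize)
        f.write('NODATA_value %d\n' % nodata)
        for row in rows:                      # first row is the northernmost
            f.write(' '.join(str(v) for v in row) + '\n')

# toy 2x3 grid of depths, not real Smith/Sandwell values
write_arc_ascii('demo.asc', [[-10, -20, -30], [-40, -50, -60]],
                xll=-180.0, yll=0.0, cellsize=0.0166667)
```

Note that the data rows are written north to south, which is the order Arc expects.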

                                        In order to test that I was decoding the binary properly, I wrote the first row of data out to its own little text file and used gnuplot to graph it up.

In my script, I have the following:

for i in range(1):
    row = struct.unpack('>'+str(ncol)+'h', img_file.read(2*ncol))  #img_file: the open .img file
    for depth in row:
        out_file.write(str(depth) + "\n")  #one depth per line, to a small text file

Then, in the terminal, I call gnuplot and at the prompt type:

                                        plot 'filename' with l

                                        Certainly looks like a nice depth profile to me:

                                        Sunday, January 3, 2010

                                        Org-mode for note organization

                                        New Year: new attempt at note organization

                                        I am always trying to find the best way to organize my random work notes and tips that I tend to collect in various notebooks, post-its, etc. I started keeping a text file last year, which was rather plain-Jane, but did the trick. Kurt, however, just introduced me to org-mode in Emacs, and I love it! Org-mode is extremely powerful. You can insert date stamps, create tags and link across multiple entries, create levels and sublevels and fold/unfold sections, highlight code blocks according to the scripting language, etc. Included html links are active from the get-go, and you can even include equations, graphics, and tables. One of the best parts is that you can export to HTML, PDF (via LaTeX), and DocBook.

                                        Here is a sample of what my notes look like in Aquamacs using org-mode. I have organized my notes by program/tool, but you could also organize by date and use org-mode to keep a daily log. On the org-mode website, you'll see tons of examples. You can also use org-mode as a task scheduler and day planner. The possibilities seem endless.

                                        And here is the resulting HTML file (ignore the old dates, I imported my old notes from last year):