Monthly Archives: June 2007

Out of the cradle and into a beta: Bioscreencast.com

I am very excited to announce the coming to fruition of a project that five of us have been working on for the last few months. It's a site based on screencasts called Bioscreencast.com. You can read more about the site on our Bioscreencast blog post and at my Omics world blog.

The entire site was coded into life by one person, our head web geek and JavaScript junkie, Suresh.

As a wannabe coder, I came away amazed at the sheer power of the many open source libraries out there, the robustness of MySQL databases, the elegance of CSS, the Swiss-army-knife versatility of ffmpeg, the clumsiness of PHP... the list goes on. What made the whole process doubly enjoyable was that all five of us are relative web newbies, and learning how these things work along the way was a lot of fun.

Watching Suresh work his web magic has made me want to learn more of the six technologies I want to master, and I have also added a few more to the list (more on this in future posts).

Just thought I'd give my plug for Bioscreencast.com. I hope the life scientists out there like the site, and that all of you will keep your feedback coming.

Links:

The Bioscreencast website

The Bioscreencast Blog

Our co-conspirator Deepak's intro post

reCAPTCHA: help the digital library project one word at a time

Adam Weiss, on Boston's Museum of Science podcast, recently interviewed Luis von Ahn, one of the people behind CAPTCHAs, those squiggly lines and puzzles that you solve every time you sign up for a website or post a comment to a blog to prove that you are a human.
They are designed so that computers cannot easily decipher the message, whereas the human brain can.

Yesterday at the Berkman Thursday blog group meeting, Adam spoke about how Luis von Ahn and others have a new take on CAPTCHAs, called reCAPTCHA. Apparently, the time spent solving these puzzles adds up to about 150,000 hours daily. So von Ahn and his group at Carnegie Mellon figured a good way to put all this time to use was to help the digital library project. What reCAPTCHA does is collect the words that fail to be recognized by optical character recognition (the computer algorithms that convert an image to text) in the digital library project and use them to authenticate users.

So now you are presented with two words: one that reCAPTCHA already knows the answer for, and one that comes from the database of unrecognized words. Thanks to your being human, you solve both words correctly and contribute one more word to the digital library project's book digitization effort.
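The two-word scheme can be sketched in a few lines. This is only an illustration of the idea as described above, not the real reCAPTCHA API; all the function and variable names here are hypothetical.

```python
def check_recaptcha(control_answer, control_expected, unknown_answer, unknown_votes):
    """Authenticate the user on the known (control) word alone; if they pass,
    record their reading of the unknown (OCR-failed) word as one vote."""
    if control_answer.strip().lower() != control_expected.strip().lower():
        return False  # failed the known word: probably not a careful human
    # Human verified: their reading of the unknown word counts as a vote.
    word = unknown_answer.strip().lower()
    unknown_votes[word] = unknown_votes.get(word, 0) + 1
    return True

def digitized(unknown_votes, threshold=3):
    """Once enough independent users agree on a reading, accept the word."""
    best = max(unknown_votes, key=unknown_votes.get, default=None)
    if best is not None and unknown_votes[best] >= threshold:
        return best
    return None
```

The key point is that the system never needs to know the unknown word in advance: passing the control word is what proves the user is human, and agreement among several humans is what digitizes the word.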

Refs:
Adam's podcast interview
CAPTCHA on Wikipedia
About reCAPTCHA
Adam Weiss helps you get podcasting

The above image is a link to the reCAPTCHA website


Get podcasting with Adam Weiss’s help

Yesterday at the Berkman Thursday blog group meeting, Adam Weiss from Boston's Museum of Science gave a talk about “all the cool and amazing things he does” and his experiences with podcasting, blogging, and other things web and digital.

Adam hosts and produces Boston's Museum of Science podcast and also his own Boston Behind the Scenes podcast. A veteran podcaster (his science podcast is almost 100 episodes old), he also serves as a podcast consultant. Adam brought along all his podcasting gear and played us some samples from his website.

The session was a big eye-opener. I was amazed at how simple his equipment was, and impressed by the very professional results it produced. I was also gladdened by his evangelical zeal and desire to make podcasting more accessible. I recommend his podcast consulting site to anyone looking to get started with podcasting.

The image above is a link from Adam Weiss's podcasting equipment guide and features the iRiver iFP-799 paired with a $15 Giant Squid Audio Lab Mini Gold-Plated Omni Mic, which is what he uses for most of his interviews.

Photosynth, Seadragon, and single particle imaging

As a graduate student I worked on a project where I used single particle imaging techniques to study the structure of a small viral protein. The protein particle fortunately has some symmetry, and using single particle image reconstruction techniques I could obtain a three-dimensional model of the particle from two-dimensional projection images taken on an electron microscope.

After Deepak got me hooked on the TED talks, I caught a talk by Blaise Aguera y Arcas on Microsoft's new application called Photosynth.

In the talk, Aguera y Arcas describes how they were able to put together a very high resolution, almost three-dimensional composite of the Notre Dame Cathedral, assembled from tagged images on Flickr.

Their software was able to accurately find the register for thousands of images from this tagged set and assemble them into the final composite. Check out the video above to get an appreciation of the complexity of the application. While I am hardly an expert in image processing, the algorithmic complexity of the application boggles my mind. Particularly impressive are the sections in the video where he talks about Photosynth finding the register of images within the assembled composite despite people, hands, and other obstacles obscuring the view of the cathedral.
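To get a feel for what "finding the register" means, here is a toy version of the problem in one dimension: slide a small strip of pixel values along a larger reference and pick the offset where they correlate best. Photosynth itself does something far more sophisticated (matching distinctive local features across thousands of photos and solving for camera positions); this sketch, with made-up function names, only illustrates the simplest form of the idea.

```python
def best_offset(reference, strip):
    """Return the offset at which `strip` best matches `reference`,
    using a plain cross-correlation score."""
    best, best_score = 0, float("-inf")
    for off in range(len(reference) - len(strip) + 1):
        window = reference[off:off + len(strip)]
        # Correlation score: large when the two segments rise and fall together.
        score = sum(a * b for a, b in zip(window, strip))
        if score > best_score:
            best, best_score = off, score
    return best
```

For example, given a reference signal with a bright peak in the middle, a strip containing that peak lines up at the peak's position. The hard part in real image registration is doing this robustly in two dimensions, under changes of scale, lighting, and viewpoint, and with occlusions, which is exactly what made the Photosynth demo so impressive.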

I also caught some of the discussion of the technology on Microsoft's Channel 9. I sure would like to know the concepts they used to put together such an amazing app, and I wonder if any of these concepts could help improve the image reconstruction techniques in use in the single particle bio-imaging field.

Ignite Boston: loved the format

Just got back from the first Ignite Boston event, organized by O’Reilly. Met a lot of really cool people and heard a few good talks.

It was a bit strange that quite a few of the talks ended up as blatant sales pitches for new companies, or were only thinly disguised so that they didn't sound like sales pitches. Regardless...

The catchiest demo was the one for Buzzword, a fully WYSIWYG, Flash-based word processing app that runs inside the browser, from a Boston-based startup, Virtual Ubiquity. Rick Treitman, the CEO, demoed the app, which is still pre-beta. The user experience it promised was truly breathtaking; Buzzword made Google Docs look like your grandpa's word processing app.

Matt Welsh gave a cool talk on wiring Cambridge with 100 sensors connected by an 802.11 wireless grid. The NSF- and Microsoft-funded CitySense project would open all these sensors to the public, make the sensors programmable, and make their data available to anyone. Along these lines, Brian Jepson gave a good talk about blanketing Rhode Island with WiFi and the cool things you could do with the Make magazine electronics kit and the RI-WINs network.

The talk by Matt Douglas was about marketing and how to go about publicizing a new web or computer startup. Woven into his talk was a plug for Punchbowl.com, which touts itself as a replacement for Evite.

Though I am hardly a gamer, the talk by Jason McIntosh from Volity was interesting in its approach to bringing open source components into the online gaming world.

Among the general interest, non-sales-pitch talks, Rod Begbie gave a fun talk about giving good presentations, and Chris Brogan's talk on using social networking to actually network effectively was very nicely done.

Greg Raiz from Raizlabs and PicMe, a photo sharing site, also gave a very nice talk about his website's approach to data organization. His theory was that perfect organization is not always what we want (or should want).

The event was held in a crowded room where it was almost impossible to hear what was going on on stage if you were in the back.

On the whole the event was a lot of fun, and I cannot wait for Ignite Boston 2.