
here, have the first step:

curl -s http://www.crunchbase.com/search/advanced/people/1250638 | grep advanced_search_query | cut -d\" -f12




There are likely much faster ways of going through this, but to expand on your thought:

  #!/bin/bash
  # Walk every CrunchBase people ID up to 1250645 and append the
  # extracted field to people.txt (one request per ID, so this takes a while).
  for (( i = 1; i < 1250646; i++ )); do
      curl -s http://www.crunchbase.com/search/advanced/people/$i \
          | grep advanced_search_query | cut -d\" -f12 >> people.txt
  done
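
If speed matters, here's a minimal sketch of a parallel variant using seq and xargs -P (assuming GNU xargs is available and the same URL pattern and field position hold):

  # Sketch only: fetch IDs 1..1250645 with up to 8 concurrent curl processes.
  # Output order is not guaranteed, and lines from different IDs may interleave.
  seq 1 1250645 | xargs -P 8 -I{} sh -c \
    'curl -s http://www.crunchbase.com/search/advanced/people/{} \
       | grep advanced_search_query | cut -d\" -f12' >> people.txt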



