Over several years, while fully employed, some friends and I built a startup with whatever time we could scrape together. ProjectSHERPA was an imagining of what the tech recruiting marketplace could be – a data-enabled web platform for finding, screening, and recruiting entry-level technology talent. Virtually everyone in the space agrees that finding and recruiting technologists is difficult. Job posts are awful, job search boards are awful, job recommenders are awful, agency recruiters are awful, and for the most part, company hiring practices are awful. We wanted to take on each of these, using software.
ProjectSHERPA Was Born
We closed up shop in 2015 due to lack of traction and ultimately lack of focus; over the years, however, we built a lot of cool stuff, memorialized here.
Our revenue model was contingency-based: for any hire made through our platform, the company would pay us a percentage of that hire’s first-year starting salary. To get to that point, we needed clients; to get clients, we needed a pool of candidates. It was a chicken-and-egg scenario, and we brainstormed ways to get candidates into our channel to kickstart the process.
Scraped Jobs = Free Content
Our idea was to provide content in the form of better job posts: we would scrape job boards for current opportunities, and convert those opportunities into better versions of themselves. Over the course of months, we scraped and analyzed hundreds of thousands of tech jobs. For each scraped job, we did a few things:
- We identified the company and role. This sounds simple, but it’s an important and somewhat complex first step. Since we were scraping several job boards, we needed to de-duplicate listings: the same job is often posted on multiple boards, and one of our goals was to unify such cases. We excluded any role for which the company was not explicitly provided. We also excluded any role posted by a recruiter or recruiting agency: over time we built up a list of thousands of staffing firms, and we dropped any job posted by any of them. Our goal was to help companies hire, not to fill recruiter coffers.
- We used APIs (and some scraping) to pull supplementary data about the company: funding details from Crunchbase, stock charts from Yahoo, links to every available social media presence, public repos on GitHub, office locations from Google, company descriptions from LinkedIn, and tons more. We wanted our job posts to show everything a prospective candidate would find useful when deciding whether or not to apply.
- We identified which programming languages and libraries would be used on the job. We built a massive taxonomy of tech frameworks and could distinguish, for example, Java from JavaScript from JScript. We could also see through the many misspellings that pepper job posts. “Engineer” can, apparently, be spelled in many other ways.
- We determined the seniority of the job, labeling jobs as junior, senior, or management.
- We estimated the salary of the job. Very few job posts include an explicit salary, so we built a model that considered location, tech stack, seniority, industry, and a few other factors to calculate one. Over time, the model learned from its own output and eventually became fairly accurate.
- Using NLP, we summarized job and company descriptions.
- For startups, we estimated when they would be fundraising again and for how much.
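As a sketch of the language-identification step above: canonical names plus fuzzy matching can absorb both aliases and misspellings. The taxonomy slice, aliases, and similarity cutoff below are illustrative, not our actual data:

```python
import difflib
from typing import Optional

# Hypothetical slice of a tech taxonomy: lowercase aliases mapped to
# canonical names. The real taxonomy would hold thousands of entries.
TAXONOMY = {
    "java": "Java",
    "javascript": "JavaScript",
    "js": "JavaScript",
    "jscript": "JScript",
    "node.js": "Node.js",
    "nodejs": "Node.js",
    "python": "Python",
}

def normalize_term(raw: str, cutoff: float = 0.8) -> Optional[str]:
    """Map a raw token from a job post to a canonical tech name.

    Exact alias lookup runs first; only unknown tokens fall through to
    fuzzy matching, which absorbs common misspellings ("javascrpt", ...).
    """
    token = raw.strip().lower()
    if token in TAXONOMY:
        return TAXONOMY[token]
    close = difflib.get_close_matches(token, TAXONOMY.keys(), n=1, cutoff=cutoff)
    return TAXONOMY[close[0]] if close else None
```

Doing the exact lookup before the fuzzy pass matters: it keeps short aliases like “JS” from being fuzzily mapped to a longer, wrong key, and it prevents “Java” from drifting toward “JavaScript”.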
We slapped a search engine on top of this data set and published SHERPA to the world. Our version of these job posts was undoubtedly better; unfortunately, we never stopped to consider whether a great product alone would be enough to attract users. We tried marketing and spreading the word over social media, but that’s not easily done part-time.
Our job search board resulted in 0 hires, so we pivoted.
Maybe an ATS?
Our next idea was an applicant tracking system (ATS). Many companies use an ATS to manage their pipeline of talent: it tracks who is interviewing and with whom, who has been granted an offer, and feedback from interviewers about each candidate, and it shows analytics about the pipeline and success rates. Perhaps an ATS integrated with our job search board would entice companies to post jobs directly with us. Going after candidates first hadn’t worked; maybe we would have success going after companies. So, we built it.
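The pipeline an ATS tracks can be sketched with a few simple records; the schema and stage names below are illustrative, not our actual data model:

```python
from dataclasses import dataclass, field
from enum import Enum

class Stage(Enum):
    APPLIED = "applied"
    INTERVIEWING = "interviewing"
    OFFER = "offer"
    HIRED = "hired"
    REJECTED = "rejected"

@dataclass
class Feedback:
    interviewer: str
    notes: str
    recommend: bool

@dataclass
class Candidate:
    name: str
    role: str
    stage: Stage = Stage.APPLIED
    feedback: list = field(default_factory=list)  # list of Feedback

def pipeline_stats(candidates: list) -> dict:
    """Minimal funnel analytics: share of candidates at or past each stage."""
    total = len(candidates) or 1
    hired = sum(1 for c in candidates if c.stage is Stage.HIRED)
    offers = sum(1 for c in candidates if c.stage in (Stage.OFFER, Stage.HIRED))
    return {"offer_rate": offers / total, "hire_rate": hired / total}
```

A real ATS layers a customizable workflow on top of records like these, which is why letting each client redefine the stages was a selling point at the time.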
Our ATS had a very slick UX, and its workflow was entirely customizable for the client (now standard fare). Whether the ATS was a selling point or not, we began to see a few successes: a few companies posted their jobs directly with us, and we shepherded candidates through our platform and into roles.
We tried some content marketing: http://www.projectsherpa.com/blog/. Writing well and often is hard, though! We continued to struggle to get enough users to justify our efforts, still operating under the mantra of ‘if you build it, [they] will come’.
RateMy(Shitty)Recruiter.com
Clearly, we were failing at marketing. We reached into our network and came up with an idea with viral potential: RateMyShittyRecruiter.com, a game-y site that sort of speaks for itself. We toned it down to RateMyRecruiter.com after getting some feedback from friends.
Personally, I think RateMyRecruiter was a fun idea with potential. It still required marketing to get the word out, though, so while it might have served as another funnel into Sherpa, it wasn’t a solution unto itself.
Concierge: A White-Labeled Service
Unfortunately, by 2014 we still weren’t seeing enough volume to justify going full-time. SHERPA was feeling more like a hobby than anything else. We decided to try out one more product, a concierge service.
Candidates would create a profile with us – our candidate profiles would be waaay better than anything on LinkedIn – and then we would let companies search our candidate pool on a SaaS subscription.
So, we built this new product, and it was good. It allowed candidates to unify their entire web presence into one place, added some free text responses, and gave companies legit insight into our pool. The design was clean and the workflow straightforward.
Companies liked the product, and hundreds of candidates signed up. But interest tapered off after that initial wave. Perhaps we simply gave up, having built so much without any big wins; either way, we called it quits.
Lessons?
I learned bucketloads throughout Sherpa. Tech-wise, everything was new: Python and Django, web scraping (Scrapy), Angular, AWS, social media APIs, and elements of data science.
Along the way, we met with numerous clients and potential investors, and we competed in a few venture capital competitions with modest success. In so doing, I learned about the VC landscape and about the research and art that go into a pitch and deck (one version of ours is here: https://www.craigperler.com/blog/wp-content/uploads/2012/04/Sherpa.pdf). We received a lot of tempered praise; I’ve since learned what that really means.
Our efforts and downfall reflect the importance of early product validation. We were thoughtful and focused, but we didn’t talk to our users soon enough; we rushed to build because we thought we had the next great idea. Perhaps we did have a great idea – the space has become increasingly crowded – and I’d like to believe we did a solid job executing on product. We didn’t stop to think, however, about how to get the product to market, how we would fill the funnels, or how to get the machine churning. With each pivot, we learned more about building product; only in retrospect is it clear that product alone does not make a business.