So I started the day by creating a profile on oDesk. Friends on Facebook had insisted I would have a much better chance of finding work on oDesk than Elance, as it had more of a tech audience and I wouldn’t be undercut by 12-year-old sweatshop workers in India. Or something along those lines.
The profile creation process was longer for oDesk, mainly because you couldn’t import CV detail from LinkedIn, so you basically had to ~~make a whole lot of stuff up~~ recreate your entire CV from memory.
After uploading some portfolio items on oDesk, it was time to return to Elance to take some skills tests. I use the phrase “skills tests” loosely. I would neither class these quizzes as *tests* nor say they involve any *skills* whatsoever. Unless you count the ability to slam your head repeatedly into your keyboard.
I started with the SEO “skills test” (note the air quotes). During the process, I became more and more frustrated. The majority of the SEO questions were about the robots.txt file, which I found strange. Others were hopelessly outdated, often because recent algorithm changes had made the question redundant, or because they referred to products/services that no longer exist, e.g. Google Places. The further I got, the more ambiguous or redundant the questions became, meaning I had to guess what the author meant rather than rely on my 18 years of solid SEO knowledge. Finally,
~~I got tired of slamming my head into the keyboard~~ there were no more questions in the pool, so I had no chance of improving my score. Wonderful.
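(For anyone wondering what an entire test’s worth of robots.txt questions could possibly cover: it’s a tiny plain-text file sitting at a site’s root that tells crawlers what not to fetch. A minimal, purely illustrative example, with made-up paths:)

```
# Served from https://example.com/robots.txt
User-agent: *            # rules below apply to all crawlers
Disallow: /admin/        # please don't crawl anything under /admin/
Allow: /admin/public/    # ...except this subfolder
Sitemap: https://example.com/sitemap.xml
```

That’s more or less the entire format, which is why a test fixating on it felt so odd.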
Next up was the AdWords “skills test”. This had similar issues to the SEO test, with ambiguous or simply outdated questions. Take, for example, the question *Can you use the same keywords in different AdGroups? Yes or No?* Although I guessed the answer correctly as *yes*, the question needs to be reworded. Any good advertiser knows that although it IS possible, you should never use the same keywords in multiple AdGroups, because Google won’t know which ad you meant to show and will just pick the one with the highest AdRank. This means you are effectively competing with yourself and driving up the CPC.
So for clarity, the question should ask “Is it possible” rather than “Can you”. (Reading that last paragraph back, I sound like a complete wanker, but other ~~wankers~~ digital marketing folk will no doubt be nodding their heads.)
The Google Analytics test was mostly straightforward, but it focused on only one or two aspects of Analytics. Which seems kind of ridiculous when you consider that Avinash Kaushik’s book on Web Analytics is 500 pages long and weighs a kilo.
Judging by some of the feedback comments, I wasn’t the only person who found this frustrating. Last of all, I took the Search Engine Marketing skills test. Again, many of the questions (or answers) were irrelevant, redundant or outdated. Quite a few were subjective, or written in a way that was simply too ambiguous.
I left quite a lot of feedback on the questions, but of course this ate into my time on each test, which lowered my overall score and increased my stabbiness and doughnut intake. I think this is a bug inherent in the Elance system, but one that could easily be fixed. Perhaps they should let test-takers flag a problem question and provide feedback at the end, after test time has been recorded. Or perhaps send each applicant a box of doughnuts by way of apology?
As you can see from the attached screenshots, I wasn’t the only one unhappy with the test quality. I particularly enjoyed Elisa’s comment: “Are you proud to write stupid questions?”
Having so much negative feedback clearly visible to test-takers reflects badly on Elance. I’m not sure how often the feedback is reviewed, but to the public it looks as though it is being ignored completely.
I wonder if oDesk will have the same issues? I’d better buy more doughnuts.
Previous posts in this series: