In trying to break away from the large pools of applicants for jobs, I'd love to be able to use skills tests more effectively, but there are problems:
1) I have major doubts about their accuracy. I'm a native speaker and instructor of advanced English, as well as an experienced editor. Sometimes I take the tests and am absolutely certain of every single answer, yet my score shows that I got some wrong. There is no way for me to see what I supposedly got wrong. I've complained about this over and over, and no one seems to care.
2) Despite the inaccuracies in the tests, I do very well on them. When I calculate my own rank from my placement out of the total number of test takers, I come in at something like the top 0.2%. However, the highest band my profile can show is "Top 10%." That means there are hundreds of people who took that test and scored below me, yet we all display the same badge. That's a major difference.
This is an issue for clients as well as freelancers. I've had clients tell me that the person they hired before did a bad job. A more stratified percentage ranking for skills tests would help a lot with that. On one test I scored 24th out of 11,038 (although I'm convinced I should have had a perfect score), and on my profile it just says "Top 10%." So there is no way for a client to see the difference between me and the roughly 1,100 people (ONE THOUSAND, ONE HUNDRED PEOPLE) in that same band who scored lower than I did. That is a huge, huge difference that the client can't see.
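To put numbers on how much information that "Top 10%" badge throws away, here is a minimal sketch of the arithmetic (my own calculation from the figures above, not Upwork's actual formula):

```python
# Sketch: compare an actual placement-based percentile with the
# coarse "Top 10%" band a profile displays. The function name and
# rounding are my own, not anything Upwork publishes.

def percentile_rank(place: int, total_takers: int) -> float:
    """Return the 'top X%' figure implied by a raw placement:
    the percentage of test takers at or above this position."""
    return 100.0 * place / total_takers

# 24th place out of 11,038 test takers:
top_pct = percentile_rank(24, 11038)
print(f"Actual standing: top {top_pct:.2f}%")  # top 0.22%
print("Displayed band:  Top 10%")

# How many people share that same displayed band?
band_size = int(11038 * 0.10)
print(f"People in the Top 10% band: about {band_size}")  # about 1103
```

So the badge lumps someone in the top 0.22% together with over a thousand lower-scoring test takers, which is exactly the distinction a client never gets to see.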
And, by the way, the creative writing test for nonfiction is a complete disaster. There are questions that are pure opinion, questions that have no bearing on one's creative writing skills, and lots of questions about playwriting, screenwriting, theater in general, and the film industry. Why?
I've done a few of the tests and done OK on them, but I feel they are not a true reflection of, or even an indicator of, a person's ability. For example, I have a better score in U.S. English than in UK English, and I was born, bred, and educated in the UK.
My specialist subject area doesn't even have a test (well, there's a vague one that is geared towards programming, but I'm not a programmer). In fact, many specialists would struggle to prove their ability via one of these 'tests'. They are of little use when recruiting; I certainly know that I wouldn't rely on them to recruit (and in my day job I have recruited many people over the years).
The most important things for a client to see are sample work and good feedback (the JSS is a joke; far too many variables in its calculation are open to 'fuzziness').
I work part-time as a freelance video game journalist and will have been doing so for a year come February 2016.
I've written close to 100 news articles and haven't had any complaints about my work from the sites I've written for (I haven't found any work on Upwork yet), yet I scored "below average" on Upwork's Online Article Writing and Blogging Test (UK Version), which would probably turn clients away if I made it public. It's not exactly a journalism skills test, but it's the closest thing they offer.
On the whole, multiple-choice tests aren't a reliable way to actually test someone's skills and knowledge.
Just to throw in my two cents, I have to say that ALL the finance tests on Elance are complete rubbish. For example, I scored 4.9 out of 5 on the "Financial Forecasting" test. As someone who has done financial forecasting for the last 15 years and currently does it for a living, I have to say that the test has nothing to do with financial forecasting. Furthermore, plenty of questions have no right answer or have multiple correct answers.
The same is true for the financial analysis and accounting skills tests, though I have to say I did not score as well on those, partly because I did not try as hard to reverse engineer the test authors' perverse logic.
I also took the Excel tests, and most of the questions there were completely irrelevant to the work I typically do, though I only use Excel for a few specific purposes (such as data mining and financial modelling). So I can imagine the questions could be more relevant to other Excel users.
Finally, the way the English tests are structured only encourages cheating: the time allowed for vocabulary questions is long enough to search for a specific word on the Internet.
I would strongly advise Upwork to hire freelancers with both domain knowledge and test-writing experience to completely rewrite all the tests. I would be more than happy to review them if needed.
Yeah, I thought the same of the SEO test I took. It was not written by someone who knows SEO. I felt like I was answering questions about hype rather than real search engine optimization.
Upwork neither writes nor designs those skills tests, nor has any real influence over them.
Just like every other platform, Upwork relies on third-party providers; in Upwork's case, that's ExpertRating.
The provider Elance used was just as bad, and some of the tests were actually identical, mistakes included.
Petra, fully agree with you. I actually flagged this issue on Elance some time ago as well. I never bothered to take any tests used by other freelancer platforms, but I suspect that they are no better.
At the same time, I strongly believe it is Upwork's responsibility to clients to choose tests that reflect freelancers' knowledge and experience, and if it cannot find such tests, not to offer tests at all. Obviously, high-quality tests would be much more expensive, but at least Upwork would not be indirectly endorsing something that is very close to outright fraud, and forcing freelancers to indirectly endorse those tests as well just to keep up with the competition. At least on Upwork the tests are free, so potential clients have the option to check them out.
You're absolutely right that the tests need improvement. I'm in graphic design, and the tests for design programs (Photoshop, InDesign, Illustrator) feature questions about arcane areas of the programs and features that 90% of designers don't use in everyday work. Also, the newest tests for Adobe programs cover version CS4, which came out in 2008!
I can only agree that a lot of the tests in my area of expertise (web development/programming) are pretty horrible. Here are some points:
a) outdated and irrelevant questions (the CSS test has questions about aural stylesheets, http://www.w3.org/TR/CSS2/aural.html; no one has ever used those, just read the first sentence of the spec lol)
b) the questions often make no sense at all
c) the questions are written in a language that sometimes barely resembles English (I am not a native speaker, but I believe you normally use things like articles and prepositions in English)
Exactly!!! This is EXACTLY what I'm trying to figure out now: why my scores are never higher than 4.75/top 10%, when I can't work out what on earth I could have gotten wrong. I don't mean to sound conceited, but I really would like to know what I missed.
I've taken three language tests (English grammar, English vocabulary, and Italian-to-English translation). A few questions were strangely or poorly written, but I flew through all three with no trouble and was confused about what I missed. I got 4.75 on each (decent, not great), and would really love to know what I missed and how to learn from my mistakes (if they were mine and not the tests'; I have my doubts). Is there any way to review my tests and see where I went wrong?
Please tell me if you resolve this.