Nov 21, 2010 04:17:17 PM (edited Oct 30, 2014 02:28:19 PM by Michael B)
[quote=kugrin]I think most of tests are fair, BUT the translation tests are horrible. (In particular the German/English and English/German translation tests.) Please fix this!! The tests were NOT written by a native speaker and are full of grammatical mistakes. I find it very difficult to pick a correct answer out of 4-6 wrong answers. Many people have complained about this before, please have 2-3 native speakers proofread them to ensure quality AND ACCURACY. Thanks![/quote]
Yeah, same problem here, and it even showed in my result, because even the supposedly "correct" answer was wrong German... Oh yes, I hereby apply for the proofreading.
Nov 7, 2010 08:17:16 PM (edited Oct 30, 2014 02:28:16 PM by Exp U)
I know this is an older thread now, but I wanted to have more experience with the oDesk tests before responding. After taking a number of the tests, I have some feedback on them:
First, the concept is great. It's very helpful to have an additional method for contractors to back up the claims made in the skills section. However, the specific test bank and/or testing service used by oDesk to provide the tests suffers from some deficiencies.
1) Too much duplication of questions between different technology tests. For example, the tests on "SQL", "SQL Server 2000", "SQL Server 2005" and "MySQL 5.0" share a relatively large number of the same questions. So a tester taking all four should score a bit higher than otherwise expected on the later tests. This is obviously subject to being gamed: a tester whose goal is to score high in, say, SQL Server 2005 could simply take all the other SQL-related tests first to get a good preview of much of the target test.
2) Too much tunnel vision in the soft-skills tests. I have noticed this in the language, management and systems analysis tests. The tests are written as if all the questions were generated from a single textbook. They may test OK for one standard approach to the subject, but often there is no SINGLE standard approach. If the tester is lucky enough to have learned and practiced according to the specific standard adopted by the test, then all is well and good. But an otherwise completely qualified tester who has learned and practiced according to a competing standard will not do well, simply because the terminology is different.
Hope this feedback is still useful at this late date.
Nov 22, 2010 10:04:00 PM (edited Oct 30, 2014 02:28:21 PM by Keter W)
On my profile I actually have a note about not fully trusting the oDesk tests because I have run into too much trouble with questions that aren't relevant or well structured. I am a technical writer and have done instructional design, and I know what a well structured quiz looks like.
The technical writing test contains several questions about particular reports that are not something that a technical writer for technology companies will ever come into contact with, so obviously we won't know anything about what goes in them (BTW, neither does Google). It should cover APPROACHES, not specific document types because those will vary widely between employers, and will be controlled by each organization's style guide.
It also asks questions about procedures for doing things in Excel 2003 or 2007 (not sure which)... well, I now have 2010, and guess what? I don't remember where things were located on the menus in the older versions... precisely because I'm an expert with the program and keep it current! Worse, it is entirely possible to spend an entire career as a technical writer and never need to know how to use Excel (I used to be a bookkeeper and became addicted early on... I use it to manage information, and almost never in actual documentation).
On the grammar test, I was expecting "fix the sentence" or "spot the bad construction" type questions, and instead it focused on parts of speech and sentence-diagramming stuff. I have been out of school for 30 years, and rarely encountered that sort of grammar teaching even when I was there, so I barely remember it! I have style guides and can look something up when required, but my usual approach is 100% practical: "If your sentence is so complicated that we have to look it up, it clearly needs to be rewritten entirely and probably split into two sentences." That's because grammar that stumps a professional writer is going to be completely indecipherable to an ordinary reader. In my experience as a former corporate manager of technical writing groups, the "grammar nazis" were the least able to write things others would like to read, and they also wasted too much time showing off and annoying everyone else - so I actively tried NOT to hire people who walked in touting their mad grammah skilz. Resume + cover letter + samples = all I needed. So that test seems flawed in its very premise from my perspective; it is biased toward recent graduates and rewards an approach to writing that I know results in ego games, lost productivity, and indirect, hard-to-read documents.
Oh, and on the vocabulary test, there is ABSOLUTELY a wrong answer that is being scored as correct. I reported it.
I swear oDesk contractors are writing these questions, and using single sources as sloban mentioned. There are frequent job postings for writing test questions on oDesk...
Nov 23, 2010 08:35:00 AM (edited Oct 30, 2014 02:28:22 PM by Jacqueline P)
Actually, none of the existing tests were created by oDesk contractors. All (except the ORT) are 100% the responsibility of ExpertRatings (our testing vendor).
Creating tests for various training and online education programs is a fairly common type of job.
Mar 6, 2014 12:14:00 PM (edited Oct 30, 2014 05:48:36 PM by Andrea W)
Those tests are a joke, suited only for those who work as software technicians. So many questions are completely irrelevant to common tasks. For example, one of the Photoshop tests pertains exclusively to photographers, when most people use the program for plenty of things unrelated to photography.
Nov 24, 2010 11:43:00 PM (edited Oct 30, 2014 02:28:24 PM by Lorraine A)
I've done all the tests I want my contractors to have, so I know how much to trust them.
They suck and are pitched at a low level. In most cases, if I wanted to google the answers, I could. There's enough time.
Getting a contractor to do a few tests is a great deterrent to a fraudster, because they take time and it scares them, as they don't know what to expect. One guy, *removed by admin*, applied for a Joomla experts contract with me. Little did he know I am a Joomla expert myself and just wanted someone to take the overflow. By the end of our discussions I found out that he was a student living in Bangladesh who needed pocket money. He knew nothing about Joomla other than that it was a popular job category, and all he could do was paste back information I had given him - information I could just as easily have pasted myself.
You can see how long they take to do the test. If it's basic English spelling and you know you did it in 5 minutes, what the hell are they doing for 18 minutes? Smoking, having a coffee, or googling the answers. Seriously!
If they don't want to do the test why should you hire them?
Just some thoughts...
Apr 10, 2011 11:04:17 PM (edited Oct 30, 2014 02:28:27 PM by Mohammad H)
Lorraine A. wrote:
"You can see how long they take to do the test. If it's basic English spelling and you know you did it in 5 minutes, what the hell are they doing for 18 minutes? Smoking, having a coffee or googling the answer. Seriously!"
Maybe it's because they're just waiting for the questions to appear on the screen.
I think it's always good to keep in mind that the internet connection speed varies widely among us. Here's an example:
Two days ago an employer asked me to download a training video, which he said downloads in about 3 minutes. When I opened the link, I saw it was a 97 MB video. Knowing my modem, I downloaded it at an internet shop whose connection is far faster than my own. Even there, it took me 60+ minutes. And I didn't accuse the employer of cheating me just because he said it was a 3-minute download. Lol.
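The gap between those two experiences is easy to quantify with a little arithmetic. A minimal sketch (the 97 MB file size and the 3- and 60-minute download times come from the post above; the derived speeds are just division, and the function name is mine):

```python
# What average connection speed does each download time imply
# for a 97 MB file? (1 megabyte = 8 megabits)

FILE_MB = 97

def implied_mbps(minutes):
    """Average speed in megabits per second needed to fetch the file."""
    return FILE_MB * 8 / (minutes * 60)

# The employer's "about 3 minutes" claim implies roughly 4.3 Mbps.
print(f"3-minute download:  {implied_mbps(3):.1f} Mbps")

# The internet shop's 60+ minutes implies roughly 0.22 Mbps at best.
print(f"60-minute download: {implied_mbps(60):.2f} Mbps")
```

So the employer's estimate assumed a connection roughly twenty times faster than what even the internet shop actually delivered - which is the poster's point about connection speeds varying widely.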