That depends. If you buy ‘premium’ code, new in the original shrinkwrap: about 2 or 3 lines.
However if you agree to second or third hand code copy pasted from elsewhere you might even get as many as 200 lines!
Low pay vs. high pay? Paid 100 or 200 euros? Huh? They were all low paid, abysmally low paid. What kind of results would you expect?
I must echo what others have said: paying £100 for this versus £200 is basically the same thing, which is 95% or more below the market rate for quality work. I have done IT security work for 20+ years and I can think of probably a dozen ways you could implement a properly secure logon, and if I was feeling generous I might chat with you for an hour about how it is done for £200. If you lock your vault with a £4 master lock, you should be in no way surprised that the bad guys got in.
However, I must say all this has generated a ridiculous security backlash. In the name of security, so many clients I know lock down access to so many things that it becomes extremely difficult to get anything done. For instance, a common constraint is no copy/paste functionality. Yet you can still send e-mail; you just can’t include attachments. Oh, that is convenient. What are we to do? It is like Fort Knox! Or I could just uuencode a file and paste the text into an email, and if I wanted to be tricky I could encrypt it before I uuencode it, to foil even the most determined e-mail snoops. So all this security and inconvenience has achieved NOTHING; all they did was make it harder for legitimate employees to do what they need to do. Don’t get me started on network security: most firewalls last about 10 minutes, tricky firewalls 2-3 hours, before they’re broken.
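For the curious, the uuencode trick looks roughly like this. A minimal sketch using base64 from the Python standard library instead of uuencode (same idea), with a toy XOR scrambler standing in for real encryption:

```python
# Turn any file into plain text you can paste into an email body, and back.
# base64 plays the role of uuencode here; the XOR "key" is only illustrative
# obfuscation, not actual encryption.
import base64

def file_to_pastable_text(path: str, key: bytes = b"") -> str:
    data = open(path, "rb").read()
    if key:  # scramble before encoding, purely to make the point above
        data = bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    return base64.encodebytes(data).decode("ascii")

def pastable_text_to_file(text: str, path: str, key: bytes = b"") -> None:
    data = base64.decodebytes(text.encode("ascii"))
    if key:  # XOR is symmetric, so the same step unscrambles
        data = bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
    open(path, "wb").write(data)
```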
Clearly those who created this study knew next to nothing about security or how it is achieved in a real-world context.
I have a friend who works at a bank, and he was looking for a way to get a bunch of files from his work laptop to his home laptop. Nothing nefarious, he just didn’t want to trigger a whole bunch of security audits and deal with HR over something trivial.
His solution was to zip the files, chunk the zip, convert those chunks to QR codes, then VPN in from home, take screenshots and then scan the codes to reconstruct the zip.
Overkill perhaps, but I get the feeling he’s wasted in his current department.
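Reconstructing it, the outbound half of that scheme is only a few lines. A rough sketch, assuming the third-party qrcode package (pip install qrcode[pil]) and keeping chunks small because a single QR code only holds a couple of kilobytes:

```python
# zip -> base64 chunks -> one QR image per chunk, each prefixed with its index
# so the receiving side can put them back in order. Illustrative only.
import base64
import pathlib
import qrcode

def zip_to_qr_images(zip_path: str, out_dir: str, chunk_size: int = 1200) -> None:
    data = pathlib.Path(zip_path).read_bytes()
    out = pathlib.Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for n, i in enumerate(range(0, len(data), chunk_size)):
        chunk = base64.b64encode(data[i:i + chunk_size]).decode("ascii")
        img = qrcode.make(f"{n:04d}:{chunk}")  # index prefix for reassembly
        img.save(out / f"chunk_{n:04d}.png")
```

The other end scans each code (screenshot by screenshot), strips the index, base64-decodes, and concatenates the chunks back into the zip.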
I agree with all of the above – the study is so flawed as to be useless.
It doesn’t even begin to explain why FB engineers – who I would imagine are compensated much more generously and are more thoroughly vetted – would leave millions of passwords in plain text, however. My first thought was surely someone working on the team would complain, “This is bad practice, let’s fix this!”
But I know software development. Some engineers are tasked with re-writing a section of the authentication system; they build a prototype password engine – a simple API that lets you associate a new password with an account ID (insert), and then authenticates a password against the account ID (look up). Since it’s a prototype, they don’t bother fleshing it out – just store it as plaintext in a file somewhere safe. We’ll come back to that part later, or even better, there’s another department (XYZ) producing the tools we need, and we’ll link it up later.
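Concretely, that kind of throwaway prototype might look something like this. A hypothetical sketch (nothing here is anyone’s actual code; the file name and function names are made up), with the hashing conspicuously deferred:

```python
# A two-call prototype password "engine": insert and look_up, storing
# credentials as plain text in a JSON file "somewhere safe".
# The TODO is exactly the part that never gets done in the story below.
import json
import pathlib

STORE = pathlib.Path("passwords.json")  # "stored in a safe place", supposedly

def _load() -> dict:
    return json.loads(STORE.read_text()) if STORE.exists() else {}

def insert(account_id: str, password: str) -> None:
    db = _load()
    db[account_id] = password  # TODO: swap in XYZ department's hashing tool
    STORE.write_text(json.dumps(db))

def look_up(account_id: str, password: str) -> bool:
    return _load().get(account_id) == password
```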
The prototype is finished, the team is given attaboys and back pats, and they go work on something else. Meanwhile, somewhere in middle-management land, they decide that prototype is something we really need right now. Zuck is on the line and he really wants this feature. Can you do it? Meanwhile the original team has undergone some re-structuring, there are new people, and the team lead is a promising up-and-comer who brashly promises, “Yeah, we can do it!” And so the prototype is fast-tracked to production. A team newbie says, “Hey, wait, you can’t store passwords in plain text!”, and a team old-timer remembers, “Yeah, we were going to fix that when XYZ department produces the tool we need. For now, this is just fine, because the file is stored in a safe place.” Everyone nods in agreement.
They might even write a user story to describe the problem: “A user would like to safely store their login credentials on the XYZ department’s encrypted, unbreakable tool…”
But the deadline is looming, and QA has just ripped the thing to shreds. Sprint after sprint is dedicated to defects, and user stories are sidelined because they increase MVP points, and we really need to roll this thing out. The fact that they’ve pushed a hastily thrown together prototype into production creates a feedback loop: more defects are filed against it.
Finally the target date arrives and they roll the thing out. Back pats and attaboys. It’s buggy, but it works! Well done! The team lead is rewarded with a raise (potential middle manager, yay! Can finally afford a house in Palo Alto, yay!). On to bigger and better things: POs bring in more feature stories and prioritize those, and that one crucial user story is now buried deep in the backlog. Someone might remember, “Oh yeah, we should go back and fix that thing…”, but since it doesn’t cause immediate harm to the user experience and, more importantly, the revenue stream, it never gets addressed.
Before or after I run npm install?
Either that or they’ve been in the habit of keeping bypasses to their own security since it was a dating app for Zuck and friends.
You laugh, but for a long time registry data in Windows was ROT13 “encrypted”: ROT13 is used in Windows? You’re joking! | Didier Stevens

Thus proving the article, I guess.
I in no way intend to defend the expected or actual quality of the resulting code; but, behaviorally, grabbing code from the internet that you don’t necessarily understand but which appears to solve your problem is a lot like treating the language you are using as having a huge, ad-hoc standard library, with Google rather than IntelliSense assisting your grovelling through it.
There are, obviously, vastly better and worse ways of doing modularity, with that approach at the low end; but not reinventing the wheel, and definitely not attempting to reinvent something you don’t understand rather than just using the implementation provided by someone smarter than you, are standard, generally encouraged practices. (Edit: especially in things like ‘implement encryption properly’, where people rolling their own rather than using a library with a lot of vetting, and ideally some cryptographers involved at some point, has been a source of eye-rolling since forever; though people casually using language libraries that they wouldn’t necessarily be able to re-implement without dusting off their textbooks and doing a bit of careful thinking extends across much of what those libraries offer, save the most trivial convenience features or the cases of quite solid programmers.)
At higher skill levels everything goes better because the modules being plugged into one another are more sanely crafted and the people plugging them together understand their structural characteristics much better; but it’s a difference of degree rather than kind in a lot of cases.
The test was definitely flawed (hint: I’ve got way too much hanging over my head at my day job to think about taking on somebody else’s for a measly couple hundred bucks), but that doesn’t mean there isn’t a real problem, and that some very common real-world engineering practices make it more likely. Agile development stresses doing the minimum work necessary to meet the stated acceptance criteria for a story written by a usually-non-technical customer or product manager. Ruthless minimalism is often socially rewarded. Finding solutions on Stack Overflow is a common and accepted way to get things done. Organizations value fungibility among developers, and developers are accustomed to LEGOing together relatively accessible APIs without needing deep knowledge of the problems or solutions an API addresses.
An area like application security is harmed by all these trends. Getting security right requires real firsthand expertise, deep consideration, extreme caution around incrementalism, and an understanding of subtle nuances implied by the particulars of your own specific application, which can be a stumbling block for even a seasoned developer. It’s no wonder at all we as an industry fuck this up regularly.
If someone doesn’t know what an IV is or how big a salt you should be using, they probably should stay away from any API which lets them make those choices directly.
If there’s one thing I learned from my semester of college level crypto theory it’s that I should avoid messing with raw cryptographic operations if at all possible, and build the secure bits of my app from the highest-level existing blocks available, preferably the most widely-used (and audited) ones.
Even then, it takes a fair amount of knowledge to match your application’s security needs with something that provides the necessary security properties, and to actually put the blocks together in a way that doesn’t subtly break those properties.
A lot of the APIs don’t give you a choice. They just say “pass the IV in here”. And then people are amazed when even giant corporations like Sony fuck it up and ruin the crypto on their marquee product.
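To make that concrete, compare a high-level API that manages the IV/nonce for you with a low-level one that hands you the foot-gun. A sketch using the third-party Python cryptography package; the settings are illustrative, not a recommendation:

```python
# High-level vs. low-level: Fernet never asks you about IVs; the raw AES-CBC
# interface makes a fresh random IV (plus padding and authentication) your problem.
import os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# High level: no IV decisions to get wrong.
f = Fernet(Fernet.generate_key())
token = f.encrypt(b"secret")

# Low level: "pass the IV in here" -- reuse one and the scheme quietly breaks.
key, iv = os.urandom(32), os.urandom(16)
enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = enc.update(b"16-byte-aligned!") + enc.finalize()  # real use also needs padding and a MAC
```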
This was apparently done for a good reason. I’m not able to find the blog post about this anymore, but I believe it was done so programs in the “most frequently used” list wouldn’t show up in search results. It was more of an obfuscation than a serious attempt at encryption.
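Which is easy to see: ROT13 is just a fixed letter rotation, and Python even ships it as a codec. A tiny illustration (the sample value is made up, in the style of the UserAssist key names Didier Stevens wrote about):

```python
# "Decrypting" a ROT13-obfuscated registry-style value is a one-liner,
# which is the point: it hides strings from casual search, nothing more.
import codecs

value = "HRZR_PGYFRFFVBA"              # made-up sample in UserAssist style
print(codecs.encode(value, "rot_13"))  # -> UEME_CTLSESSION
```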
A one-hour job wouldn’t be worth taking for a professional - once you count set-up, ramp-up (understanding requirements and constraints), and billing time, there’d be at best a few minutes left to work if it was a super-simple task. Maybe enough time to copy-paste a few pages from the first old tutorial you find. (Which seems to fit with the results they got.)
About 7-10 years ago, the hourly rate for a lone, unknown freelancer would have been somewhere within that 100-200 range. But a project like that would take several hours, and therefore cost several times what they offered.
I think the purpose of the test was not so much to show the lack of security concern in slap-dash jobs, but to see if offering more pay had any effect on the quality of the code delivered: to test the idea some managers have that bad code comes from not paying enough.
Which, to be fair, speaks to a common illusion: that the quality of the work delivered scales with the amount you offer to pay. And though the amount seems paltry to us, we don’t really know the details of what was expected. Perhaps it was only expected that the Java devs would Lego together a CRUD API instead of writing custom code, and the only thing the researchers were interested in was whether the devs thought to make password storage secure or not.
No test is truly without prejudice, and I suspect the scientists who devised this one had an idea of what to expect and merely confirmed their expectations. Programmers will only do what is expected, and will not implement secure practices unless you explicitly tell them to. They will not cover your ass, they are mercenaries.
The average staff tenure at Facebook is 2.5 years[0]. Most people writing the web software that you use every day might as well be €100 code-and-dash randos.
As long as tech companies only offer raises to job-hoppers it’s hard to imagine this problem getting better.
I am happy that Cory referenced the OWASP password cheat sheet; it does provide good advice. I would go further with password complexity advice in systems over which I have influence, but this is a good start. If you already have a system whose password protections need upgrading, the OWASP document provides advice on that too, which happens to have been written by me.
I wrote a long and detailed article about it because a very large bank was still using only MD5 for their existing, external-facing systems. They recognized the problem, but didn’t have a solution that didn’t require resetting all users’ passwords at once. I gave them the three options presented in my article and they chose one very similar to the demo that is shown. Other companies have used my starting points to update their systems and privately thanked me. If you have a system that needs updating, feel free to reach out and I’ll try to give some advice.
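For a flavour of what such a migration can look like, here is one common approach, sketched with the third-party Python bcrypt package. It is not necessarily the option the bank chose, and the function names are made up: wrap every stored MD5 hash in bcrypt in one offline pass, then at login hash the submitted password with MD5 before checking bcrypt, so nobody has to reset anything:

```python
# Upgrade legacy MD5 password hashes in place: the stored value becomes
# bcrypt(md5_hex) instead of bare md5_hex. No user interaction required.
import hashlib
import bcrypt

def upgrade_stored_hash(md5_hex: str) -> bytes:
    """Run once over every existing row."""
    return bcrypt.hashpw(md5_hex.encode("ascii"), bcrypt.gensalt())

def verify(password: str, wrapped_hash: bytes) -> bool:
    """At login: recompute the legacy MD5, then check it against the bcrypt wrapper."""
    md5_hex = hashlib.md5(password.encode("utf-8")).hexdigest()
    return bcrypt.checkpw(md5_hex.encode("ascii"), wrapped_hash)
```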
I’m not sure requiring a course to use a crypto library is bad.
Some things are hard and require lots of work to master.
Maybe it’s not possible for 1 Lone Engineer to code entire applications anymore.
IMHO a lot of companies would do well to hire 3-4 people that can grok logic + have a specialty (security, front end design, backend, etc) rather than one “rock star”.
Wait, so you’re telling me something as complex as a maglev train probably can’t be designed, built and operated by just one guy?