dennisgorelik: (Default)
A couple of years ago, security pen testers found a clickjacking bug in Google API Explorer:
Google paid out a $1,337 bounty for it.
“The idea behind the exploit is to frame the page where that button was, and make the frame transparent.”

Here is a demo of what a hijacking setup page looks like.
The hijacker's web site shows content that invites the user to click somewhere:
<p><input type="button" value="Click to see cats' videos"></p>
and overlays a semi-transparent iframe on top of that button:
<style>
iframe {
	position: absolute;
	top: 0; left: 0;
	filter: alpha(opacity=50); /* for old IE */
	opacity: 0.50; /* near 0 in a real attack; 0.50 here so you can see the overlay */
}
</style>
<iframe src=""></iframe>
Note that Wikipedia's security team made a conscious choice to allow clickjacking of their home page, because there is nothing at risk there.
But if you click "Log in" (or replace "" with "" in the demo HTML above), you will notice that the Wikipedia login page is not rendered in the iframe.

How did Wikipedia do that?
I opened the Fiddler2 debugging proxy and found out that "" returns this HTTP header:
X-Frame-Options: DENY
But the "" page does NOT return that header.

How to prevent clickjacking?
Extra experimenting showed that some pages use "X-Frame-Options: SAMEORIGIN" while others use "X-Frame-Options: DENY" (the same as the Wikipedia login page).
There are three possible directives for X-Frame-Options:
X-Frame-Options: DENY
X-Frame-Options: SAMEORIGIN
X-Frame-Options: ALLOW-FROM uri

What is the best practice for using "X-Frame-Options"?
I am trying to decide which "X-Frame-Options" value I should use for our own web site.
Is the flexibility of iframes worth the security risk?
Should we support web sites that include our content in their own iframes?
Such iframe support has both pros and cons...

Why isn't the secure option the default choice in browsers?
What do you think: why do browsers (such as Google Chrome and Firefox) not assume "X-Frame-Options: SAMEORIGIN" by default?
If allowing your page content to be loaded into a parent frame is inherently insecure, then such risky behavior should have to be explicitly requested, right?
dennisgorelik: (Default)
We host 2 PostJobFree servers on SoftLayer (in their Dallas datacenter).
In the last year I started to get more and more warning signs that SoftLayer is slowly decaying (after its acquisition by IBM three years ago).
So, finally, I decided to check how good the uptime of SoftLayer's own web site is.

So I created a new "Keyword" monitor in UptimeRobot.
The monitor checks whether the "Data Centers" wording is rendered in SoftLayer's home page HTML.
UptimeRobot runs that check every minute.

So, how much uptime is the legendary hosting provider able to keep for their own web site?
According to UptimeRobot, SoftLayer's home page uptime is a pathetic 99%.
That means there is a 1% chance that the SoftLayer home page is down at any given moment.
Among hosting providers, uptime below 99.9% is considered poor, and uptime above 99.99% is considered good.
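For a quick sanity check of what those percentages mean in absolute terms (plain JavaScript; the numbers are generic, not UptimeRobot output):

```javascript
// Convert an uptime percentage into expected downtime per year.
var MINUTES_PER_YEAR = 365 * 24 * 60; // 525600

function downtimeMinutesPerYear(uptimePercent) {
	return MINUTES_PER_YEAR * (100 - uptimePercent) / 100;
}

console.log(downtimeMinutesPerYear(99));    // 5256 minutes, ~3.65 days per year
console.log(downtimeMinutesPerYear(99.9));  // ~526 minutes, ~8.8 hours per year
console.log(downtimeMinutesPerYear(99.99)); // ~53 minutes per year
```

So 99% uptime corresponds to roughly 3.65 days of downtime per year, while 99.99% allows less than an hour.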

According to UptimeRobot, when SoftLayer's home page is up, it has an average response time of 681.72ms (about 0.7 seconds, which is kind of OK).

To put things in perspective: the PostJobFree home page (hosted on a dedicated server in SoftLayer) has effectively 100% uptime (99.99%+) and a 139ms average response time.

So for now our dedicated servers on SoftLayer still work, but if the SoftLayer tech team keeps deteriorating, they will eventually mess up their core network too, and that would bring downtime to our servers as well.

So I am looking for a new hosting provider now.
Would you recommend any?
dennisgorelik: (2009)
When users open my web site, I want to know what JavaScript errors they get (if any).
That's why I append this JavaScript to almost every page on my web site:
window.onerror = function(errmessage, errurl, errline) {
	var params = {
		list: [],
		add: function(name, value) {
			if (value != null) this.list.push(name + '=' + encodeURIComponent(value));
			return this;
		},
		toString: function() {
			if (this.list.length) return '?' + this.list.join('&');
			return '';
		}
	};
	// String concatenation calls params.toString(), which builds the query string.
	new Image().src = '/jeh' + params.add('errmessage', errmessage)
		.add('errurl', errurl)
		.add('errline', errline)
		.add('r', Math.floor((Math.random() * 10) + 1));
};
That script reports JavaScript errors from the user's browser back to our server.
Once per day, our server aggregates these errors and emails a "JavaScript Errors" report to the developers.
This is an example of what that report looks like:

We then review these errors on a case-by-case basis and decide whether we want to fix an error or to suppress it from the report (because we cannot fix it).

Still, there are challenges: sometimes it is hard to separate errors that we can fix from errors that we cannot.
For example, we cannot fix the most frequent error, "Uncaught ReferenceError: google is not defined", because it is caused by occasional browsers that do not work well with the Google Maps API.
But we do not want to suppress that error either, because some day we may, by mistake, introduce a problem in our own JavaScript that generates the same error message at mass scale for our users.

See: the discussion in the ivan-gandhi blog.


Page generated Sep. 21st, 2017 08:38 am
Powered by Dreamwidth Studios