"Password protected" pages with JavaScript

This article is going to be very short[citation needed]. I'm going to talk primarily about Neocities here, but it applies to any similar host that doesn't allow server-side scripting.

If you like to check out other people's Neocities websites, you might have noticed that some pages have a form of password protection. There's a little form, you type in your password and you're let in. But how do they do that? Neocities.org doesn't let you use server-side scripting, and you certainly can't do it at the OS level here... One simple way to achieve it is with JavaScript.

Content:
  1. Limitations
    1. Security through obscurity
      1. Keep the secret hidden
  2. Code
    1. Demo
  3. Tell web search engines to not crawl the secret page
    1. robots.txt
    2. Meta tag robots

Limitations

Before I show you how to write a simple code snippet that lets you do the same, I'd like to talk about the limitations of this approach. JavaScript is, in the sense we're talking about here (unless you're using something like node.js), a client-side scripting language. That means all the code that's going to run is sent to the client (the user's web browser). In turn, that means the client (the user) has full access to read, modify or turn off the code. It's not very secret.

Security through obscurity

We can't run scripts on the server side, so how can we securely verify the user? Well, we can't. We could try checking the password in JavaScript (also shortened to JS), but as mentioned above, there's no use in that, since readers of our website can change the code at any time. Instead, we will count on a secret that only the chosen ones know: the address of the page we want to hide from others.
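To see why a JS password check is pointless, here's a naive sketch (the function name and the password are made up for illustration). The "secret" sits in plain text in the page source, so anyone who views the source can read it, or simply delete the check:

```javascript
// A naive client-side check -- hypothetical example, do NOT rely on this.
// Anyone can open "view source" and read the password right here.
function checkPassword(entered) {
  var SECRET = "hunter2"; // visible to every single visitor
  return entered === SECRET;
}
```

A visitor doesn't even need to guess: the browser's developer tools let them rewrite this function to always return true.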

Keep the secret hidden

What this means is that we'll use the address of our secret page as a "password". (It looks much cooler to type it into a password field than into the browser's address bar.) That requires secrecy, because the moment someone knows the URL of the secret page, they can get in. So your readers should never share the link with anyone, you should never link directly to the secret page, and no other website should ever link directly to it either. And just to be on the safe side, don't link directly to other websites from the secret page: the Referer header in their server logs could let others know about it.
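If you do need outbound links on the secret page, one extra hedge (assuming a reasonably modern browser) is to ask the browser not to send the Referer header at all, with a meta tag between the <head></head> tags of the secret page:

```html
<!-- Ask the browser not to send a Referer header for any
     request made from this page (links, images, etc.) -->
<meta name="referrer" content="no-referrer">
```

This only helps against accidental leaks through server logs; it does nothing about people sharing the link on purpose.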

Code

I know, I know. You're saying: get on with it! You want the code; I just wanted to warn you (above) that this is not the same as logging in to your Instagram account.


<form name="login" onsubmit="window.location.href = '[preceding path]' + document.login.password.value + '[file extension]'; return false">
Password: <input type="password" name="password">
<input type="submit" value="Knock knock!">
</form>

There's a few things to point out. Consider an example URL for a secret page: example.com/something/secret.html

[preceding path]
This should be changed to the path leading to the file. In our example example.com/something/secret.html, the part you'd want to put there is /something/.
document.login.password.value
This is taken from the password field (what readers type in). In our example example.com/something/secret.html, that's the secret part.
[file extension]
The file extension is the part after the period (dot) in the file name. So in our example example.com/something/secret.html, it's .html.

What this code does is take our "password", insert it into the path we gave the script (see above), and redirect us there. The redirect works through window.location.href, which changes the address of the active window. Of course, the reader has to have JS turned on.
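The string-building part of that one-liner, pulled out as a plain function (the function name is my own, just to make the three pieces explicit):

```javascript
// Glues together the three pieces the form concatenates:
// path and ext are hard-coded in your page; password is what the reader types.
function buildSecretUrl(path, password, ext) {
  return path + password + ext;
}

// With the example from above:
// buildSecretUrl("/something/", "secret", ".html") -> "/something/secret.html"
```

In the form itself, this result is assigned to window.location.href, which is what performs the redirect.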

Demo

Try the password: swordfish

Password:

Tell web search engines to not crawl the secret page

Note: This only solves the problem of your secret web page showing up in web search results; people will still be able to find your page if someone links to it.

robots.txt

What you can do, in case a link to your secret page slips out, is use robots.txt to tell web crawlers (the programs that add web pages to web search engines' databases) to ignore it. The big ones (Google, Bing, etc.) all respect robots.txt and won't show the page in search results. However, I'd be careful, because any crawler (or reader, for that matter) that's less polite can use your robots.txt file to discover your secret pages.

If you'd like to use this, create a new file in your root directory called robots.txt and add the following:


User-agent: *
Disallow: /something/secret.html

User-agent: * means all crawlers should follow this; Disallow: /something/secret.html means they should ignore /something/secret.html from our example above.

Note: Disallow also works with directories (Disallow: /something/ in our example) or even with / alone (which applies to the whole website, as it refers to its root directory).

Meta tag robots

A better option (one that doesn't tell rude crawlers about your secret web page) is to use a meta tag directly in your secret page. Put it between the <head></head> tags of the page.

<meta name="robots" content="noindex, nofollow">

Note: Of course, if bad web crawlers have already discovered your secret page, they can ignore the meta tag, too. In that case, changing the URL is probably your best option.

That's all, folks!

Well, that's it from me for today. If you have any questions about the code or anything else in this article, feel free to contact me on fedi @bugbear@alt.lawndale.space


see you, space cowboy
