Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
node.js - What is the smartest way to handle robots.txt in Express?

I'm currently working on an application built with Express (Node.js) and I want to know what is the smartest way to handle different robots.txt for different environments (development, production).

This is what I have right now but I'm not convinced by the solution, I think it is dirty:

app.get '/robots.txt', (req, res) ->
  res.set 'Content-Type', 'text/plain'
  if app.settings.env == 'production'
    res.send "User-agent: *\nDisallow: /signin\nDisallow: /signup\nDisallow: /signout\nSitemap: /sitemap.xml"
  else
    res.send "User-agent: *\nDisallow: /"

(NB: it is CoffeeScript)

There should be a better way. How would you do it?

Thank you.

1 Answer


Use a middleware function. That way robots.txt is handled before any session, cookie-parsing, or other middleware:

app.use('/robots.txt', function (req, res, next) {
    res.type('text/plain');
    res.send("User-agent: *\nDisallow: /");
});
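For the environment-specific case from the question, the same middleware idea can pick the body from a small map keyed on the environment. A minimal sketch — the production rules are copied from the question, and the handler is written as a plain factory so the selection logic doesn't depend on Express itself:

```javascript
// One robots.txt body per environment; production rules are the
// ones from the question, everything else blocks all crawlers.
var robotsTxt = {
    production: "User-agent: *\nDisallow: /signin\nDisallow: /signup\nDisallow: /signout\nSitemap: /sitemap.xml",
    development: "User-agent: *\nDisallow: /"
};

// Returns a request handler for the given environment name.
// Unknown environments fall back to the restrictive development body.
function robotsHandler(env) {
    var body = robotsTxt[env] || robotsTxt.development;
    return function (req, res) {
        res.type('text/plain');
        res.send(body);
    };
}

// Wiring (assumes an existing Express app):
// app.use('/robots.txt', robotsHandler(app.settings.env));
```

The environment check happens once at startup instead of on every request, and adding a staging variant is just another key in the map.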

With Express 4, app.get handlers run in the order they appear, so you can just use that:

app.get('/robots.txt', function (req, res) {
    res.type('text/plain');
    res.send("User-agent: *\nDisallow: /");
});
