I have an SPA built with create-react-app and want to serve a robots.txt at:
http://example.com/robots.txt
I see on this page that:
"You need to make sure your server is configured to catch any URL after it's configured to serve from a directory."
But for Firebase Hosting, I'm not sure what to do.
In my /public directory, I created a robots.txt.
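For example, a permissive robots.txt that allows all crawlers could look like this (the contents here are purely illustrative, not from the original post; use whatever rules your site needs):

User-agent: *
Disallow: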
In my /src directory, I did the following.
I created /src/index.js:
import React from 'react'
import ReactDOM from 'react-dom'
import {TopApp} from './TopApp'
import registerServiceWorker from './registerServiceWorker'
import {BrowserRouter} from 'react-router-dom'

// Mount the router-wrapped app on the root element defined in public/index.html
ReactDOM.render(
  <BrowserRouter>
    <TopApp/>
  </BrowserRouter>,
  document.getElementById('react-render-root')
)
registerServiceWorker()
I created /src/TopApp.js:
import React from 'react'
import {
  Switch,
  Route
} from 'react-router-dom'
import {ComingSoon} from './ComingSoon'
import {App} from './App'

export class TopApp extends React.Component {
  render() {
    return (
      <div className="TopApp">
        {/* Only these paths are handled by the client-side router */}
        <Switch>
          <Route path='/MyStuff' component={App}/>
          <Route exact path='/' component={ComingSoon}/>
        </Switch>
      </div>
    )
  }
}
Because the path /robots.txt is not covered by the router, and because create-react-app copies everything in /public into the build output, a real robots.txt file exists at the root of the deployed site. Firebase Hosting serves an existing static file before applying any rewrite, so a request for /robots.txt never reaches the client-side router, and the robots file was published as desired. The same can be done for sitemap.xml.
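For reference, this works even with the usual catch-all SPA rewrite in firebase.json. A typical configuration for a create-react-app deploy might look like the following ("build" is the default CRA output directory; adjust it to your setup):

{
  "hosting": {
    "public": "build",
    "rewrites": [
      {
        "source": "**",
        "destination": "/index.html"
      }
    ]
  }
}

Firebase Hosting only applies the "**" rewrite when no file matches the requested path, so robots.txt and sitemap.xml in the build output are served as-is.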