# Live-Demo
The demo can be found at [https://nvk.entless.org](https://nvk.entless.org)
# Setup
## Docker setup
The root folder contains a `docker-compose.yml`. Since [all configuration should be done via environment variables](https://12factor.net/de/config),
you can define all necessary URLs and ports in `docker-compose.yml`.
All images are configured for production use.
There are four containers: Elasticsearch, client, server, and init.
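The environment-variable wiring might look like the following minimal sketch. Service names, the `ELASTICSEARCH_URL` and `PORT` variables, and the image tag are illustrative assumptions, not taken from the actual file:

```yaml
# Sketch of docker-compose.yml environment wiring (names are assumptions).
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.8.23
    environment:
      - discovery.type=single-node
  server:
    build: ./server
    environment:
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - PORT=3000
```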

## Manual setup
### Elasticsearch
#### Index creation and mapping for Elasticsearch
We assume that Elasticsearch is running on [localhost:9200](http://localhost:9200).
The following request creates an index together with a mapping for it:
```
curl -XPUT localhost:9200/family2coords -H "Content-Type: application/json" -d '{"mappings" : {
        "doc" : {
            "properties" : {
                "year": { "type" : "keyword" },
                "name" : { "type" : "keyword" },
                "location": {
                    "type": "geo_point"
                }
            }
        }
    }
}'
```
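For reference, once documents are indexed, the `geo_point` field can be searched with Elasticsearch's geo queries. A sketch of a bounding-box query body for `localhost:9200/family2coords/_search` (the coordinates are arbitrary examples):

```json
{
  "query": {
    "bool": {
      "filter": {
        "geo_bounding_box": {
          "location": {
            "top_left":     { "lat": 55.0, "lon": 5.0 },
            "bottom_right": { "lat": 47.0, "lon": 15.0 }
          }
        }
      }
    }
  }
}
```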
#### Data insertion
The data can, for example, be found at [https://entless.org/~niklas/family2coords.zip](https://entless.org/~niklas/family2coords.zip).
Use the `csv2es` script in `csv2es/` to add the data to the new index:
```
npm install &&
./csv2es.js family2coords_1996.csv 1996 && 
./csv2es.js family2coords_1890.csv 1890
```
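Conceptually, `csv2es.js` turns each CSV row into an Elasticsearch document matching the mapping above. A minimal sketch of that conversion, assuming a column layout of `name,latitude,longitude` (the actual columns in the CSV may differ):

```javascript
// Convert one CSV row into a document matching the family2coords mapping.
// The column layout (name, latitude, longitude) is an assumption, not
// taken from csv2es.js itself.
function rowToDoc(line, year) {
  const [name, lat, lon] = line.split(',');
  return {
    year,                                             // mapped as keyword
    name,                                             // mapped as keyword
    location: { lat: Number(lat), lon: Number(lon) }  // mapped as geo_point
  };
}

console.log(JSON.stringify(rowToDoc('Mueller,52.52,13.405', '1996')));
// prints {"year":"1996","name":"Mueller","location":{"lat":52.52,"lon":13.405}}
```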
Alternatively, you can use `init.js`.
It assumes that two files exist in `csv2es/`:
`familyname2coordinate_1890.csv.gz` and `familyname2coordinate_1996.csv.gz`.
### Server and client
#### Start server
In `server/`:
```
npm install && npm start
```
This starts the server on [localhost:3000](http://localhost:3000)
(check `package.json` if you need it on another port).
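The port is typically set in the `start` script or read from an environment variable. A hypothetical `package.json` excerpt (the script and entry-point name `server.js` are assumptions; the real file may differ):

```json
{
  "scripts": {
    "start": "PORT=3000 node server.js"
  }
}
```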

#### Start client
In `client/`:
```
npm install && npm start
```
This starts the client on [localhost:4200](http://localhost:4200).

Alternatively, to build for production, run in `client/`:
```
npm run build
```
This builds the app; the output is placed in `client/dist/`.

# Problems
The process of feeding the data into Elasticsearch is machine-dependent: on slow machines, for example, requests may time out. If this happens to you, adjust the insertion parameters in `csv2es/csv2es.js`.
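One common fix for bulk-request timeouts is sending smaller batches, so each request finishes before the client gives up. A generic sketch of that idea (the helper name and batching approach are illustrative; `csv2es.js` may structure its inserts differently):

```javascript
// Split parsed CSV rows into smaller batches so each Elasticsearch
// bulk request stays small. Generic sketch, not taken from csv2es.js.
function toBatches(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// Five rows in batches of two -> three bulk requests instead of one.
console.log(toBatches([1, 2, 3, 4, 5], 2).length); // prints 3
```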