Simple search engine - though there is nothing to search yet.

# Installing

It is recommended to use [virtualenv](https://virtualenv.pypa.io).
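
For example, to create and activate a virtualenv first (a minimal sketch; the directory name `venv` is an arbitrary choice):

```
virtualenv venv
source venv/bin/activate
```

Then install the dependencies: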

```
pip install -r requirements.txt
```

## Testing

If you just want to test and don't want to install a PostgreSQL database
but have Docker installed, just use the `docker-compose.yml`.

This is only for testing; do not use this `docker-compose.yml` in production!
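
To bring the test database up, standard Docker Compose usage should be enough (a minimal sketch, assuming the services are defined in `docker-compose.yml`):

```
docker-compose up -d
```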

## Sphinx-search / Manticore-search

You can use [Sphinx-search](http://sphinxsearch.com/), but it is recommended to use
[Manticore-search](https://manticoresearch.com/), since the latest version of
Sphinx-search (3.x) is distributed as closed-source instead of open-source.

All the explanations below assume Manticore-search for the moment.

# Crawling

For now there is an example spider for the neodarz website.
To test it, just run:

```
python app.py
```

The database is stored in the SQLite file `khanindexer.db` at the root of the project.
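
To take a quick look at what was crawled, you can open that file with the standard `sqlite3` command-line client (assuming it is installed; the table names depend on the project's models):

```
sqlite3 khanindexer.db ".tables"
```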

# Indexing

Before launching the indexing or searching commands, you must verify that the
folder referenced by the `path` option exists on your system. (Warning: the last
component of the `path` option is the value of the `source` option; do not create
that folder itself, only its parent folder.)

Example with the configuration for the index `datas`:

```
index datas {
    source = datas
    path = /tmp/data/datas
}
```

Here the folder to create is `/tmp/data/`.
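
In this case only the parent folder has to exist, so it is enough to create it with:

```
mkdir -p /tmp/data
```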

The command for indexing is:
```
indexer --config sphinx_search.conf --all
```
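
If the search daemon is already running, the index usually has to be rebuilt with the `--rotate` flag so that `searchd` picks up the new files (a standard Sphinx/Manticore indexer option):

```
indexer --config sphinx_search.conf --all --rotate
```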

# Searching

Before you can run searches, you must launch the search server:
```
searchd -c sphinx_search.conf
```
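
To stop the daemon later, `searchd` provides a `--stop` option:

```
searchd -c sphinx_search.conf --stop
```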

Example search command:
```
curl -X POST '127.0.0.1:8080/search' -d 'index=datas&match=@content livet&select=id&limit=5' --output -
```