This is a PyTorch implementation of MobileNet v2 network with DeepLab v3 structure used for semantic segmentation.
The backbone of MobileNetV2 comes from the paper:
>[Inverted Residuals and Linear Bottlenecks: Mobile Networks for Classification, Detection and Segmentation](https://arxiv.org/abs/1801.04381v3)
And the segmentation head of DeepLabv3 comes from the paper:
>[Rethinking Atrous Convolution for Semantic Image Segmentation](https://arxiv.org/abs/1706.05587)
Please refer to these papers for details on Atrous Convolution, Inverted Residuals, Depthwise Convolution, and ASPP if you are unsure about any of these blocks.
# How to use?
First, install the dependencies of this implementation.
This implementation is written for Python 3.5 with the following libraries:
>- torch 0.4.0
>- torchvision 0.2.1
>- numpy 1.14.5
>- opencv-python 3.4.1.15
>- tensorflow 1.8.0 (necessary for tensorboardX)
>- tensorboardX 1.2
Use `sudo pip install <lib>` to install them (substitute each library name for `<lib>`).
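For convenience, and assuming `pip` points at the same Python 3.5 environment, the pinned versions above can also be installed in one command (adjust versions to your platform as needed): `sudo pip install torch==0.4.0 torchvision==0.2.1 numpy==1.14.5 opencv-python==3.4.1.15 tensorflow==1.8.0 tensorboardX==1.2`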
Then, prepare the Cityscapes dataset or your own dataset.
Currently, Cityscapes is the only dataset supported without any modification.
The Cityscapes dataset should have the following hierarchy:

```
dataset_root
|   trainImages.txt
|   trainLabels.txt
|   valImages.txt
|   valLabels.txt
|
└───gtFine(Label Folder)
|   └───train(train set)
|   |   └───aachen(city)
|   |   └───bochum
|   |   └───...
|   |
|   └───test(test set)
|   └───val(val set)
|
└───leftImg8bit(Image Folder)
    └───train
    └───test
    └───val
```
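As a quick sanity check before training, a small script along these lines (not part of this repo, just a sketch based on the tree above) can verify that your dataset root matches the expected layout:

```
import os
import sys

def check_cityscapes_layout(root):
    """Report entries from the hierarchy above that are missing under `root`."""
    expected_files = ["trainImages.txt", "trainLabels.txt",
                      "valImages.txt", "valLabels.txt"]
    expected_dirs = [os.path.join(folder, split)
                     for folder in ("gtFine", "leftImg8bit")
                     for split in ("train", "val", "test")]

    missing = [f for f in expected_files if not os.path.isfile(os.path.join(root, f))]
    missing += [d for d in expected_dirs if not os.path.isdir(os.path.join(root, d))]

    if missing:
        print("Missing under {}:".format(root))
        for entry in missing:
            print("  " + entry)
    else:
        print("Dataset layout looks OK.")

if __name__ == "__main__":
    check_cityscapes_layout(sys.argv[1] if len(sys.argv) > 1 else ".")
```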
Third, modify `config.py` to fit your own training policy and configuration.
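The option names below are only illustrative placeholders (check `config.py` in your checkout for the real ones); the settings you will typically want to adjust are things like the dataset path, crop size, batch size, learning rate, and number of epochs:

```
# Illustrative placeholders -- the actual names in config.py may differ.
DATASET_ROOT = '/your/path/to/dataset/'   # same path passed via --root
EXP_DIR      = './exp_dir'                # checkpoints and tensorboard summaries
CROP_SIZE    = 512                        # training crop size
BATCH_SIZE   = 8                          # lower this if you run out of GPU memory
BASE_LR      = 1e-3                       # initial learning rate
EPOCHS       = 150
```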
Finally, run `python main.py --root /your/path/to/dataset/`, or just run `python main.py`.
After training, TensorBoard is also available for observing the training procedure via `tensorboard --logdir=./exp_dir/summaries`.
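Putting the last two steps together: run `python main.py --root /your/path/to/dataset/` in one terminal, then `tensorboard --logdir=./exp_dir/summaries` in another (assuming the default `exp_dir` is unchanged); TensorBoard serves on `http://localhost:6006` by default.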