doc/tutorials/quick_start/index_en.md (+8 −7 lines)
@@ -12,7 +12,7 @@ This tutorial will teach the basics of deep learning (DL), including how to impl
To get started, please install PaddlePaddle on your computer. Throughout this tutorial, you will learn by implementing different DL models for text classification.
-To install PaddlePaddle, please follow the instructions here: <a href="../../build/index.html">Build and Install</a>.
+To install PaddlePaddle, please follow the instructions here: <a href="../../getstarted/build_and_install/index_en.html">Build and Install</a>.
## Overview
For the first step, you will use PaddlePaddle to build a **text classification** system. For example, suppose you run an e-commerce website and want to analyze the sentiment of user reviews to evaluate product quality.
-You can refer to the following link for more detailed examples and data formats: <a href="../../ui/data_provider/pydataprovider2.html">PyDataProvider2</a>.
+You can refer to the following link for more detailed examples and data formats: <a href="../../api/data_provider/pydataprovider2_en.html">PyDataProvider2</a>.
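As a plain illustration of this kind of input (hypothetical helper names, not the PyDataProvider2 API), review text can be mapped to the integer word ids a text classifier consumes:

```python
# Illustrative only (plain Python, not PaddlePaddle API): turning review
# text into integer word ids for a text classifier.
def build_vocab(texts):
    """Assign an integer id to every distinct word, in first-seen order."""
    vocab = {}
    for text in texts:
        for word in text.lower().split():
            vocab.setdefault(word, len(vocab))
    return vocab

def to_word_ids(text, vocab):
    """Map a review to word ids, dropping out-of-vocabulary words."""
    return [vocab[w] for w in text.lower().split() if w in vocab]

reviews = ["great product fast shipping", "terrible quality very bad"]
vocab = build_vocab(reviews)
print(to_word_ids("great quality", vocab))  # → [0, 5]
```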
## Network Architecture
In this section, you will explore four kinds of network architectures.
First, you will build a logistic regression model. Later, you will also get a chance to build other, more powerful network architectures.
-For more detailed documentation, you can refer to the <a href="../../ui/api/trainer_config_helpers/layers_index.html">layer documentation</a>. All configuration files are in the `demo/quick_start` directory.
+For more detailed documentation, you can refer to the <a href="../../api/trainer_config_helpers/layers.html">layer documentation</a>. All configuration files are in the `demo/quick_start` directory.
### Logistic Regression
The architecture is illustrated in the following picture:
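The model can be summarized in one line: the probability of a positive label is a sigmoid over a weighted sum of the input words. A minimal plain-Python sketch (illustrative bag-of-words inputs and hypothetical weights, not the PaddlePaddle configuration):

```python
import math

# Plain-Python sketch of logistic regression over a bag of words
# (illustrative, not the PaddlePaddle configuration): p = sigmoid(w·x + b).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_positive(word_ids, weights, bias=0.0):
    """Probability that a review (list of word ids) has a positive label."""
    return sigmoid(sum(weights[i] for i in word_ids) + bias)

weights = {0: 2.0, 1: -3.0}          # hypothetical learned per-word weights
p = predict_positive([0], weights)   # review containing only word 0
print(round(p, 3))  # → 0.881
```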
@@ -366,7 +366,7 @@ You can use a single-layer LSTM model with Dropout for our text classification problem
<br>
## Optimization Algorithm
-<a href="../../ui/api/trainer_config_helpers/optimizers.html">Optimization algorithms</a> include Momentum, RMSProp, AdaDelta, AdaGrad, Adam, and Adamax. You can use the Adam optimizer here, with L2 regularization and gradient clipping, because Adam has been proven to work very well for training recurrent neural networks.
+<a href="../../api/trainer_config_helpers/optimizers.html">Optimization algorithms</a> include Momentum, RMSProp, AdaDelta, AdaGrad, Adam, and Adamax. You can use the Adam optimizer here, with L2 regularization and gradient clipping, because Adam has been proven to work very well for training recurrent neural networks.
```python
settings(batch_size=128,
         learning_rate=2e-3,
         learning_method=AdamOptimizer(),
         regularization=L2Regularization(8e-4),   # L2 regularization
         gradient_clipping_threshold=25)          # gradient clipping
```
@@ -391,7 +391,8 @@ paddle train \
--use_gpu=false
```
-If you want to install the remote training platform, which enables distributed training on clusters, follow the instructions in the <a href="../../cluster/index.html">Platform</a> documentation. We do not provide examples of how to train on clusters here; please refer to other demos or the platform training documentation for more details.
+We do not provide examples of how to train on clusters here. If you want to train on clusters, please follow the <a href="../../howto/cluster/cluster_train_en.html">distributed training</a> documentation or other demos for more details.
## Inference
You can use the trained model to perform prediction on a dataset with no labels. You can also evaluate the model on a dataset with labels to obtain its test accuracy.
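The two uses can be sketched in plain Python (hypothetical names; the real model outputs come from the Paddle prediction tools below): prediction thresholds the model's score, and evaluation compares thresholded scores against labels.

```python
# Illustrative sketch (plain Python): thresholding model scores for
# prediction, and computing test accuracy against known labels.
def classify(score, threshold=0.5):
    """Binary label from a model score in [0, 1]."""
    return 1 if score >= threshold else 0

def accuracy(scores, labels):
    """Fraction of labeled examples whose thresholded score matches."""
    correct = sum(classify(s) == y for s, y in zip(scores, labels))
    return correct / len(labels)

scores = [0.9, 0.2, 0.7, 0.4]   # hypothetical model outputs
labels = [1, 0, 0, 0]           # ground truth for the labeled test set
print(accuracy(scores, labels))  # → 0.75
```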
@@ -406,7 +407,7 @@ paddle train \
--init_model_path=./output/pass-0000x
```
-We will give an example of performing prediction using the recurrent model on a dataset with no labels. You can refer to the <a href="../../ui/predict/swig_py_paddle_en.html">Python Prediction API</a> tutorial, or other <a href="../../demo/index.html">demos</a>, for the prediction process using Python. You can also use the following script for inference or evaluation.
+We will give an example of performing prediction using the recurrent model on a dataset with no labels. You can refer to the <a href="../../api/predict/swig_py_paddle_en.html">Python Prediction API</a> tutorial, or other <a href="../../tutorials/index_en.html">demos</a>, for the prediction process using Python. You can also use the following script for inference or evaluation.
Inference script (`predict.sh`):
@@ -508,7 +509,7 @@ The scripts of data downloading, network configurations, and training scripts are
* \--config_args: Other configuration arguments.
* \--init_model_path: The path of the initial model parameters.
-By default, the trainer saves the model every pass. You can also specify `saving_period_by_batches` to set how often (in batches) the model is saved. You can use `show_parameter_stats_period` to print the statistics of the parameters, which are very useful for tuning. Other command line arguments can be found in the <a href="../../ui/index.html#command-line-argument">command line argument documentation</a>.
+By default, the trainer saves the model every pass. You can also specify `saving_period_by_batches` to set how often (in batches) the model is saved. You can use `show_parameter_stats_period` to print the statistics of the parameters, which are very useful for tuning. Other command line arguments can be found in the <a href="../../howto/cmd_parameter/index_en.html">command line argument documentation</a>.
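The kind of per-parameter statistics such a period prints can be sketched in plain Python (illustrative fields only; the trainer's actual output format may differ):

```python
# Illustrative sketch of simple parameter statistics useful for tuning
# (hypothetical fields; not the trainer's exact output).
def parameter_stats(values):
    """Mean and largest magnitude of a flat list of parameter values."""
    n = len(values)
    mean = sum(values) / n
    abs_max = max(abs(v) for v in values)
    return {"mean": mean, "abs_max": abs_max}

stats = parameter_stats([0.5, -1.5, 2.0, -1.0])
print(stats)  # → {'mean': 0.0, 'abs_max': 2.0}
```

Watching `abs_max` grow without bound, for example, is a quick sign that gradient clipping or regularization needs adjusting.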