
How to do it...
Logistic regression serves as a building block for complex neural network models that use the sigmoid as an activation function. The logistic function (or sigmoid) can be represented as follows:

\sigma(z) = \frac{1}{1 + e^{-z}}
The preceding sigmoid function forms a continuous curve with values bounded within [0, 1], as illustrated in the following figure:

Sigmoid functional form
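As a quick illustration of this bounded behavior, here is a minimal Python/NumPy sketch (the function name and sample inputs are illustrative, not taken from the book) that evaluates the sigmoid over a range of values:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Evaluate the sigmoid over a few inputs to see the S-shaped behavior
z = np.linspace(-6, 6, 7)
print(np.round(sigmoid(z), 4))
# Inputs near -6 approach 0, inputs near +6 approach 1, and sigmoid(0) = 0.5
```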
The formulation of a logistic regression model can be written as follows:

Pr(y = 1 \mid X) = \sigma(W^{T}X + b) = \frac{1}{1 + e^{-(W^{T}X + b)}}
Here, W is the weight vector associated with the features X = [x1, x2, ..., xm], and b is the model intercept, also known as the model bias. The objective is to optimize W (and b) with respect to a given loss function, such as cross-entropy. Another view of the logistic regression model, producing Pr(y=1|X), is shown in the following figure:

Logistic regression architecture with the sigmoid activation function
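To make the Pr(y=1|X) formulation concrete, the following minimal NumPy sketch (the helper names such as predict_proba and the toy data are assumptions for illustration, not the book's code) computes the logistic regression output for a batch of feature vectors X with weights W and bias b, along with the cross-entropy loss mentioned above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(X, W, b):
    """Pr(y=1 | X) = sigmoid(W . X + b) for a batch of feature vectors."""
    return sigmoid(X @ W + b)

def cross_entropy(y_true, y_prob, eps=1e-12):
    """Binary cross-entropy loss averaged over the batch."""
    y_prob = np.clip(y_prob, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_prob) + (1 - y_true) * np.log(1 - y_prob))

# Toy example: 4 samples with 3 features each (illustrative data only)
X = np.array([[ 0.5,  1.2, -0.3],
              [ 1.0, -0.7,  0.8],
              [-1.5,  0.3,  0.6],
              [ 0.2,  0.9, -1.1]])
y = np.array([1, 0, 0, 1])

W = np.zeros(3)   # weights to be optimized against the loss
b = 0.0           # bias (model intercept)

probs = predict_proba(X, W, b)
print("Pr(y=1|X):", probs)               # all 0.5 before any training
print("cross-entropy:", cross_entropy(y, probs))
```

Training then amounts to adjusting W and b (for example, by gradient descent) so that this cross-entropy value decreases on the training data.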