Backpropagation for dummies

Sachin Joglekar's blog

Backpropagation is a method of training an Artificial Neural Network (ANN). If you are reading this post, you already have an idea of what an ANN is. First, though, let's take a look at the fundamental component of an ANN: the artificial neuron.

[Figure: a single artificial neuron $N_i$ and its inputs]

The figure shows the working of the i-th neuron (let's call it $N_i$) in an ANN. Its output (also called its activation) is $a_i$. The j-th neuron $N_j$ provides one of the inputs to $N_i$.

How does $latex N_i$ produce its own activation?

1. $N_i$ stores a weight for each of its inputs. Let's say the weight that $N_i$ assigns to the input from $N_j$ is $w_{i,j}$. As the first step, $N_i$ computes a weighted sum of all its inputs; call it $in_i$. Therefore,

$in_i = \sum_{k \in \mathrm{Inputs}(N_i)} w_{i,k} \, a_k$

(a runnable sketch of this step follows the excerpt below)

View original post (2,267 more words)
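To make step 1 concrete, here is a minimal Python sketch of a single neuron (mine, not code from the original post). The weighted sum mirrors the formula above; the sigmoid in the second step is an assumption, since the excerpt is cut off before the post names its activation function.

```python
import math

def neuron_activation(weights, inputs):
    """One artificial neuron: weighted sum, then activation.

    weights[k] is w_{i,k}, the weight N_i assigns to its k-th input;
    inputs[k] is a_k, the activation of the k-th input neuron.
    """
    # Step 1 (from the post): the weighted sum in_i over all inputs.
    in_i = sum(w * a for w, a in zip(weights, inputs))
    # Step 2 (an assumption -- the excerpt ends before this): squash
    # in_i with a sigmoid, a common choice in backpropagation
    # tutorials, to produce the activation a_i.
    return 1.0 / (1.0 + math.exp(-in_i))

# A neuron with three inputs:
print(neuron_activation([0.5, -0.2, 0.1], [1.0, 0.3, 0.7]))  # ~0.625
```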

Gephi boosts its performance with new “GraphStore” core

Gephi blog

Gephi is a graph visualization and analysis platform – the entire tool revolves around the graph the user is manipulating. All modules (e.g. filtering, ranking, layout) touch the graph in some way or another, and everything happens in real time, reflected in the visualization. It's therefore extremely important to rely on a robust and fast underlying graph structure. As explained in this article, we decided in 2013 to rewrite the graph structure and started the GraphStore project. Today, this project is mostly complete, and it's time to look at some of the benefits GraphStore brings to Gephi as its 0.9 release approaches.

Performance is critical when analyzing graphs. A lot can be done to optimize how graphs are represented and accessed in the code, but it remains a hard problem. The first versions of Gephi didn't always shine in that area, as the graphs were using a lot of…

View original post (642 more words)
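To illustrate what "how graphs are represented and accessed" can mean in practice, here is a toy sketch (mine, not Gephi's – GraphStore itself is written in Java and its real data structures are richer) contrasting an object-per-node layout with a flat, index-based one. Compact, array-backed storage of this kind reduces memory overhead and improves cache locality when modules repeatedly scan the graph.

```python
# Object-heavy style: one object per node, neighbors in per-node lists.
class Node:
    def __init__(self):
        self.neighbors = []  # list of Node references

# Flat, index-based style (CSR-like): the whole graph in two int arrays.
# Neighbors of node i live at targets[offsets[i]:offsets[i + 1]].
offsets = [0, 2, 3, 4]   # 3 nodes; offsets has one entry per node, plus one
targets = [1, 2, 0, 0]   # node 0 -> {1, 2}, node 1 -> {0}, node 2 -> {0}

def neighbors(i):
    return targets[offsets[i]:offsets[i + 1]]

print(neighbors(0))  # [1, 2]
```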