We have joined the Kaggle competitions. We entered not only to demonstrate our abilities in data mining but also to improve our skills by facing as many different situations as possible. We will keep participating in Kaggle competitions from now on. We have chosen a customer purchase prediction problem from InstaCart. (Read more…)
Machine learning is very successful, but its successes rely crucially on human machine-learning experts, who select appropriate ML architectures (deep learning architectures or more traditional ML workflows) and their hyperparameters. As the complexity of these tasks is often beyond non-experts, the rapid growth of machine learning applications has created (Read more…)
Backpropagation is a way of calculating the gradient of the loss function with respect to the weights in an artificial neural network. It is commonly used as part of algorithms that optimize the performance of the network by repeatedly adjusting the weights until termination conditions are met, for example in the gradient descent algorithm. It is also called backward propagation of errors. The document shows how backpropagation works in an artificial neural network. Download Artificial Neural_Network_with Backpropagation to see the detailed calculation steps.
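As a companion to the document, the loop described above (forward pass, backward propagation of errors, weight update) can be sketched in a few lines of NumPy. This is a minimal illustration, not the document's worked example: the network size (2 inputs, 4 hidden units, 1 output), the sigmoid activation, the squared-error loss, the XOR data, and the learning rate are all assumptions chosen for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy inputs (XOR)
y = np.array([[0], [1], [1], [0]], dtype=float)              # toy targets

W1 = rng.normal(0.0, 1.0, (2, 4))  # input -> hidden weights (sizes are illustrative)
W2 = rng.normal(0.0, 1.0, (4, 1))  # hidden -> output weights
lr = 1.0                           # learning rate (illustrative)

h = sigmoid(X @ W1)
out = sigmoid(h @ W2)
loss_before = float(np.mean((out - y) ** 2))

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)

    # backward pass: propagate the error gradient layer by layer
    d_out = (out - y) * out * (1 - out)    # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)     # delta propagated back to the hidden layer

    # gradient descent: adjust weights against the gradient
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)

h = sigmoid(X @ W1)
out = sigmoid(h @ W2)
loss_after = float(np.mean((out - y) ** 2))
```

After training, `loss_after` should be well below `loss_before`, which is the whole point of iterating the backpropagation-plus-update cycle.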
We are going to talk about statistical distance. For anyone working in multivariate analysis, statistical distance is a fundamental topic, and one that must be understood before doing any similarity or dissimilarity measurement. First of all, let's think about the Euclidean distance, the straight-line distance measurement. If we consider a point P = (x1, x2) in the plane, the straight-line distance d from P to the origin is, according to the Pythagorean theorem, d = √(x1² + x2²), and (Read more…)
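The Pythagorean formula above is easy to check numerically. A minimal sketch, using an illustrative point of my own choosing (the classic 3-4-5 right triangle, not a point from the post):

```python
import math

def euclidean_to_origin(x1, x2):
    # Straight-line distance from P = (x1, x2) to the origin:
    # d = sqrt(x1^2 + x2^2), per the Pythagorean theorem.
    return math.sqrt(x1 ** 2 + x2 ** 2)

d = euclidean_to_origin(3.0, 4.0)  # hypotenuse of a 3-4-5 right triangle
```

In practice `math.hypot(x1, x2)` computes the same quantity with better numerical behavior for extreme values.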