Word2Vec Model

A Not-So-Gentle Introduction to the Word2Vec Model

Introduction

Word2Vec, introduced by a team of researchers at Google in 2013, is one of the most widely used neural network language models. Its underlying assumption is that two words sharing similar contexts also share a similar meaning (the Distributional Hypothesis) and consequently receive similar vector representations from the model. For instance, “dog”, “puppy” and “pup” are often used in similar situations, with similar surrounding words such as “good”, “fluffy” or “cute”, so according to Word2Vec they will end up with similar vector representations. [Read More]
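To make the idea concrete, here is a minimal sketch using the gensim library (assuming gensim 4.x; the toy corpus and hyperparameters are illustrative assumptions, not part of the original post). It trains a tiny skip-gram model and asks for the nearest neighbours of “dog”, which should include “puppy” and “pup” because they appear in the same contexts.

```python
# A minimal sketch, assuming gensim 4.x is installed (pip install gensim).
# The tiny toy corpus below stands in for real training data, so the
# resulting vectors are only illustrative.
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; "dog", "puppy" and "pup" share
# surrounding words ("fluffy", "cute", "good"), so their vectors end up close.
corpus = [
    ["the", "dog", "is", "fluffy", "and", "cute"],
    ["the", "puppy", "is", "fluffy", "and", "cute"],
    ["the", "pup", "is", "good", "and", "fluffy"],
    ["the", "car", "is", "fast", "and", "loud"],
]

# sg=1 selects the skip-gram variant; vector_size, window and epochs are
# kept small to suit the toy corpus.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Words sharing contexts should rank as nearest neighbours of "dog".
print(model.wv.most_similar("dog", topn=3))
```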

Hong Kong Through A Lens

Capture The Moments

Photos taken with a Huawei P20.

photo  hk