Hyperbolic Bayesian Graph Neural Networks

I think this topic is really interesting for future publications, or even a PhD thesis. I couldn’t find anything online in this direction, so I will write my thoughts down here. I would like to explain my view so that it is a lot easier for you to try it out.

So, let’s begin. First, I would like to say a few words about why I think hyperbolic geometry is really useful here. Ten years ago, when I worked on my master’s thesis at the Department of Mathematics at the University of Sarajevo, my focus was the asymptotic behavior of the Selberg zeta function through degeneration of hyperbolic manifolds. Hyperbolic manifolds are interesting because an analogue of the Riemann hypothesis holds for the Selberg zeta function on compact Riemann surfaces. At that time, I had no idea that hyperbolic geometry could be applied anywhere in applied mathematics, especially in data science.

After that, I finished my PhD in theoretical mathematics and wanted to apply everything I had learned, so I moved into machine learning. In my free time I explored all the possible areas of overlap between mathematics and machine learning. In 2018 I started working as a data scientist, and one of my first tasks was to build an NLP model for sentiment analysis. I realized that Facebook had released pre-trained word2vec-style word vectors for free, so you could download them and use them as input to your own model. That was really generous of them, and we all used them, but one question came to my mind: why is this embedding space so large? Why 100 dimensions and not something smaller? Do we really need 100 numbers to represent one word?

Then I discovered that the Facebook AI Research team had published the paper “Poincaré Embeddings for Learning Hierarchical Representations” at the 31st Conference on Neural Information Processing Systems (NIPS 2017). I immediately read it and realized that in hyperbolic space a word can be represented with far fewer numbers; maybe 5 dimensions are totally enough. That was the real trigger for me to wonder whether Facebook’s internal implementation was hyperbolic. Perhaps they gave us Euclidean weights for free, but internally worked in hyperbolic space. Maybe I was totally wrong, but if we look at the subsequent developments, the Facebook AI Research team has published more papers in exactly that direction.
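To make this concrete, here is a minimal NumPy sketch of the Poincaré ball distance from that paper. The two 5-dimensional points are purely illustrative:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Distance between two points inside the unit Poincare ball."""
    # Squared norms, clipped strictly below 1 for numerical safety.
    uu = np.clip(np.dot(u, u), 0, 1 - eps)
    vv = np.clip(np.dot(v, v), 0, 1 - eps)
    duv = np.dot(u - v, u - v)
    # d(u, v) = arcosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
    return np.arccosh(1 + 2 * duv / ((1 - uu) * (1 - vv)))

# Two 5-dimensional points near the boundary are already "far apart":
u = np.array([0.0, 0.0, 0.0, 0.0, 0.9])
v = np.array([0.9, 0.0, 0.0, 0.0, 0.0])
print(poincare_distance(u, v))  # ~5.2, versus a Euclidean distance of ~1.27
```

Distances blow up near the boundary of the ball, which is exactly what lets a few hyperbolic dimensions encode a deep hierarchy that would need many Euclidean ones.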

Just imagine how much memory is saved by going from 100 dimensions down to 5. And it is not just one word, but sentences, and there are a lot of them.
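Here is the back-of-the-envelope arithmetic, assuming a vocabulary of one million words stored as float32 (the vocabulary size is my illustrative assumption, not a figure from any paper):

```python
# Memory for an embedding table: vocab_size * dimensions * bytes per float.
vocab_size = 1_000_000
bytes_per_float = 4  # float32

euclidean = vocab_size * 100 * bytes_per_float  # 100-dim embeddings
hyperbolic = vocab_size * 5 * bytes_per_float   # 5-dim embeddings

print(euclidean / 1e6, "MB")   # 400.0 MB
print(hyperbolic / 1e6, "MB")  # 20.0 MB, a 20x reduction
```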

Two years later, they published the paper “Hyperbolic Graph Neural Networks” at the 33rd Conference on Neural Information Processing Systems (NeurIPS 2019) in Vancouver, Canada. Just think about that for a moment. Facebook uses graphs as a tool to learn everything about us, and they published a paper applying hyperbolic geometry in exactly that direction. That only confirmed my hypothesis about their internal use of hyperbolic geometry. It is really amazing to see a great company using high-level mathematics to obtain better results. For a mathematician who worked in hyperbolic geometry, it is a real satisfaction to see and understand.

And it is not just Facebook that likes hyperbolic space; a Stanford research team does as well. At the same conference they published the paper “Hyperbolic Graph Convolutional Neural Networks”. It is really nice to see that they have kept working on this, so there is now a lot of material in that direction. One of the authors, Jure Leskovec, also has a lecture, “Hyperbolic graph embeddings”, that you can find on YouTube in his course “Machine Learning with Graphs”. Nothing more needs to be said: Facebook and Stanford really like hyperbolic space, just like I do.

On the other hand, if you want to capture the model’s uncertainty, you should try Bayesian deep learning. But can we do the same for graph neural networks? Of course. Take a look at “A Survey on Bayesian Graph Neural Networks”, published at the 13th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), 2021.
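If you have never seen Bayesian layers in practice, here is a minimal TensorFlow Probability sketch; the layer sizes, the dummy batch, and the 20 Monte Carlo passes are all illustrative assumptions:

```python
import tensorflow as tf
import tensorflow_probability as tfp

# A small Bayesian classifier head: the weights are distributions, so
# repeated forward passes give different predictions, and their spread
# estimates the model's uncertainty.
model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(64, activation="relu"),
    tfp.layers.DenseFlipout(10),  # logits for 10 classes
])

x = tf.random.normal([32, 128])                    # dummy batch of features
samples = tf.stack([model(x) for _ in range(20)])  # 20 Monte Carlo passes
mean = tf.reduce_mean(samples, axis=0)             # predictive mean
stddev = tf.math.reduce_std(samples, axis=0)       # predictive uncertainty
```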

What I have not seen is the mix of “Hyperbolic Graph Neural Networks” and “Bayesian Graph Neural Networks”. We could call that mix “Hyperbolic Bayesian Graph Neural Networks”. Crazy, isn’t it? Well, I am writing the idea down here so that you can pick it up and start this multidisciplinary area. I think Facebook and Stanford would probably like it.

But how to start? Well, we have Google for that. Not to google it, but to use two of its wonderful libraries: TensorFlow Probability and TensorFlow GNN. The first is amazing for Bayesian deep learning and the second is for graph neural networks (both are open source). You only need to figure out how to write the hyperbolic layers in Keras; implementations can be found in libraries like hyperlib. For those of you who are not TensorFlow oriented, there are equivalent libraries for PyTorch. Just google them :)
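To show what I mean, here is a toy sketch combining the two ingredients: a variational (Flipout) linear layer acting in the tangent space at the origin, followed by the exponential map onto the Poincaré ball. This is only one naive way to combine the ideas, not an established architecture; a real model would also need hyperbolic neighborhood aggregation (as in the HGNN/HGCN papers) and proper handling of the KL terms that DenseFlipout adds to the layer losses:

```python
import tensorflow as tf
import tensorflow_probability as tfp

def exp_map_zero(v, eps=1e-9):
    """Exponential map at the origin of the Poincare ball (curvature -1):
    maps a tangent (Euclidean) vector onto the open unit ball."""
    norm = tf.maximum(tf.norm(v, axis=-1, keepdims=True), eps)
    return tf.tanh(norm) * v / norm

class HyperbolicBayesianDense(tf.keras.layers.Layer):
    """Toy 'hyperbolic Bayesian' block: a variational linear map in the
    tangent space at the origin, then the exp map back onto the ball."""
    def __init__(self, units):
        super().__init__()
        self.dense = tfp.layers.DenseFlipout(units)

    def call(self, x):
        return exp_map_zero(self.dense(x))

layer = HyperbolicBayesianDense(5)
out = layer(tf.random.normal([8, 16]))  # points inside the unit ball
print(tf.norm(out, axis=-1))            # all norms < 1
```

Since tanh is bounded by 1, every output lands strictly inside the ball, while the Flipout layer keeps the weights stochastic, so repeated calls give different hyperbolic embeddings whose spread reflects uncertainty.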

That’s it from me and happy new Hyperbolic Bayesian Graph Neural Network year.
