Meeting #2054
h1. 1 Schedule
[[Seminars_schedule]]
h1. 2 Requirements
25 minutes per presentation.
Main requirements for the presentation (a minimal LaTeX skeleton is sketched after this list):
* to be prepared in LaTeX,
* to be short, understandable, clear and convenient,
* no more than 20 minutes for content delivery and 5 minutes for questions,
* references on the last slide.
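For orientation only, here is a minimal Beamer skeleton that fits the points above: roughly one slide per minute of the 20-minute delivery slot, with references on the last slide. The theme, frame titles, institute field and the single bibliography entry are placeholders, not an official template.

<pre><code class="latex">
% Minimal Beamer skeleton -- an illustrative sketch, not a mandated template.
% Theme, title data and the bibliography entry below are placeholders.
\documentclass{beamer}
\usetheme{default}

\title{Paper Title}
\author{Student Name}
\institute{Institute / Programme Name}
\date{\today}

\begin{document}

\begin{frame}
  \titlepage
\end{frame}

% Roughly 15--18 content slides (about one per minute) keeps the talk
% within the 20-minute delivery limit, leaving 5 minutes for questions.
\begin{frame}{Problem statement}
  \begin{itemize}
    \item What problem does the paper solve?
    \item How does it relate to your thesis?
  \end{itemize}
\end{frame}

\begin{frame}{Method}
  % Key idea, model, loss function, etc.
\end{frame}

\begin{frame}{Results}
  % Main experiments and comparisons.
\end{frame}

% Last slide: references, as required.
\begin{frame}{References}
  \begin{thebibliography}{9}
    \bibitem{paper} Author A. et al. Paper title. arXiv preprint, 2018.
  \end{thebibliography}
\end{frame}

\end{document}
</code></pre>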
h1. 3 Topics
Each student has to present a research paper and part of their thesis.
Open list of cutting-edge topics:
|_.Topic|_.Link|_.Reporter|_.Scheduled|
|MXNet DL framework|Chen T. et al. Mxnet: A flexible and efficient machine learning library for heterogeneous distributed systems //arXiv preprint arXiv:1512.01274. – 2015. URL: https://arxiv.org/pdf/1512.01274|?|?|
|Manifold MixUp| Manifold Mixup: Better Representations by Interpolating Hidden States. URL: https://arxiv.org/pdf/1806.05236v4 |-| |
|SphereFace| SphereFace: Deep Hypersphere Embedding for Face Recognition URL: https://arxiv.org/pdf/1704.08063.pdf|-| |
|UMAP | McInnes, Leland and John Healy (2018). “UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction”. In: ArXiv e-prints. arXiv: "1802.03426 [stat.ML]":http://arxiv.org/abs/1802.03426|||
| DisCoCat model| Grefenstette E. Category-theoretic quantitative compositional distributional models of natural language semantics //arXiv preprint arXiv:1311.1539. – 2013. URL: https://arxiv.org/abs/1311.1539|*Elizaveta Tagirova*|Feb-28|
|DisCoCat toy model|Gogioso S. A Corpus-based Toy Model for DisCoCat //arXiv preprint arXiv:1605.04013. – 2016. URL: https://arxiv.org/pdf/1605.04013.pdf|*Anik Chakrabarthy*|Mar-14|
|Triplet Loss|Schroff F., Kalenichenko D., Philbin J. FaceNet: A unified embedding for face recognition and clustering //Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. – 2015. URL: https://arxiv.org/pdf/1503.03832.pdf|*Alexandra Luchkina*|Mar-7|
|Style transfer SotA (state-of-the-art)| A Style-Based Generator Architecture for Generative Adversarial Networks. URL: https://arxiv.org/abs/1812.04948 |*Klim Markelov* | Mar-14|
|Pixel Recurrent Neural Networks | Oord A., Kalchbrenner N., Kavukcuoglu K. Pixel recurrent neural networks //arXiv preprint arXiv:1601.06759. – 2016. URL: https://arxiv.org/abs/1601.06759|*Omid Razizadeh*|Mar-14|
|Performance of Word Embeddings|review and experience|*Andrey Zubkov*|Mar-7|
|CosFace|Wang H. et al. CosFace: Large margin cosine loss for deep face recognition //Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. – 2018. – pp. 5265-5274. URL: https://arxiv.org/pdf/1801.09414.pdf|*Akilesh Sivaswamy*|Mar-7|
|CNN for speech command recognition. Review||*Roman Kozinets*|Apr-4|
|Text clustering | Xu J. et al. Self-taught convolutional neural networks for short text clustering //Neural Networks. – 2017. – Vol. 88. – pp. 22-31. URL: https://arxiv.org/abs/1701.00185 |*Evgeniy Kurochkin*|Mar-7|
|A Quantum-Theoretic Approach to Distributional Semantics|Blacoe W., Kashefi E., Lapata M. A quantum-theoretic approach to distributional semantics //Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. – 2013. – pp. 847-857. URL: http://www.aclweb.org/anthology/N13-1105|*Ivan Rogalsky*|Apr-4|
|State of the art Deep Learning - Intelligent Network Traffic Control Systems|State of the art Deep Learning: Evolving Machine Intelligence Toward Tomorrow's Intelligent Network Traffic Control Systems. URL: https://ieeexplore.ieee.org/document/7932863 |*Ravi Kumar*|Mar-28|
|Solving the Quantum Many-Body problem with ANN| Carleo G., Troyer M. Solving the quantum many-body problem with artificial neural networks //Science. – 2017. – Vol. 355. – No. 6325. – pp. 602-606. URL: https://arxiv.org/pdf/1606.02318|*Munyaradzi Njera*|Mar-28|
|Predicting Oil Movement in a Development System Using Deep Latent Dynamic Models|Video: https://www.youtube.com/watch?v=N3iV-F4aqLA Slides: https://bayesgroup.github.io/bmml_sem/2018/Temirchev_Metamodelling.pdf|*Jetina Tsvaki*|Apr-18|
|Artificial Intelligence Safety and Cybersecurity: a Timeline of AI Failures| Yampolskiy R. V., Spellchecker M. S. Artificial Intelligence Safety and Cybersecurity: A Timeline of AI Failures (2016) //arXiv preprint arXiv:1610.07997. URL: https://arxiv.org/abs/1610.07997|*Thomas Vialars*|?|
|Review of master thesis topic||*Seth Gyamerah*|?|
|Using ML for Network Intrusion Detection|Sommer R., Paxson V. Outside the closed world: On using machine learning for network intrusion detection //2010 IEEE Symposium on Security and Privacy. – IEEE, 2010. – pp. 305-316. URL: https://www.computer.org/csdl/proceedings/sp/2010/6894/00/05504793.pdf|*Dylan Bersans*|Apr-18|
|Tacotron 2|Shen J. et al. Natural TTS synthesis by conditioning WaveNet on mel spectrogram predictions //2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). – IEEE, 2018. – pp. 4779-4783. URL: https://arxiv.org/pdf/1712.05884.pdf|*Leyuan Sheng*|Apr-11|
|BERT (Google)|Devlin J. et al. Bert: Pre-training of deep bidirectional transformers for language understanding //arXiv preprint arXiv:1810.04805. – 2018. URL: https://arxiv.org/abs/1810.04805|||
|Universal Sentence Encoder|Cer D. et al. Universal sentence encoder //arXiv preprint arXiv:1803.11175. – 2018. URL: https://arxiv.org/pdf/1803.11175.pdf|||
|ULMFiT|Howard J., Ruder S. Universal language model fine-tuning for text classification //arXiv preprint arXiv:1801.06146. – 2018. URL: https://arxiv.org/pdf/1801.06146.pdf|*Juan Pinzon*|Mar-21|
|ELMo|Peters M. E. et al. Deep contextualized word representations //arXiv preprint arXiv:1802.05365. – 2018. URL: http://www.aclweb.org/anthology/N18-1202|||
|Skip-thoughts, Infersent, RandSent - Facebook|1. Kiros R. et al. Skip-thought vectors //Advances in Neural Information Processing Systems. – 2015. – pp. 3294-3302. URL: https://arxiv.org/pdf/1506.06726.pdf
2. Conneau A. et al. Supervised learning of universal sentence representations from natural language inference data //arXiv preprint arXiv:1705.02364. – 2017. URL: https://arxiv.org/abs/1705.02364
3. Wieting J., Kiela D. No Training Required: Exploring Random Encoders for Sentence Classification //arXiv preprint arXiv:1901.10444. – 2019. URL: https://arxiv.org/pdf/1901.10444.pdf|||
h1. 4 Master thesis topics
Open list of reports on master theses (statement of work, review, and results):
|_.Reporter|_.Topic|_.Scheduled|
|\3=.1st year students|
|1 Chakrabarti Anik | |May-16|
|2 Razizadeh Omid | |May-16|
|3 Siyoto Owen | ||
|4 Munyaradzi Njera | |May-23|
|5 Averyanov Evgeniy | ||
|6 Kozinets Roman | |May-16|
|7 Melnikov Arsentiy | ||
|8 Rogalsky Ivan |Open System Categorical Quantum Semantics in NLP (master thesis, review) |Apr-18|
|9 Urynbassarov Mukhtar | ||
|10 Yakovenko Olga | ||
|11 Tagirova Elizaveta | ||
|12 Tsvaki Jetina | |May-16|
|13 Ravi Kumar | |Apr-18|
|\3=.2nd year students|
|1 Leyuan Sheng | |Apr-25 |
|2 Akilesh Sivaswamy | |Apr-4 |
|3 Juan Fernando Pinzon Correa | |Apr-11 |
|4 Gyamerah Seth | | |
|5 Fishman Daniil | | |
|6 Gusev Petr | | |
|7 Kurochkin Evgeniy | |Apr-25 |
|8 Luchkina Anastasia | |Apr-25 |
|9 Malysheva Anastasia | | |
|10 Sergeev Artem | | |
|11 Zubkov Andrey | |Apr-25 |
|12 Poteshkin Vitaly | | |
|13 Markelov Klim | |Apr-11 |
|14 Tussupova Madina | | |
|15 Marinov Andrey | | |
h1. 5 At fault
These students have not yet selected a paper to report on or have not been assigned to a time slot:
1st year students
* *Owen Siyoto*: refer, master
* *Evgeniy Averyanov*: refer, master
* *Melnikov Arsentiy*: refer, master
* *Mukhtar Urynbassarov*: refer, master
* *Olga Yakovenko*: refer, master
* *Elizaveta Tagirova*: master
* *Thomas Vialars*: refer - time slot
2nd year students
* *Fishman Daniil*: refer, master
* *Gusev Petr*: refer, master
* *Malysheva Anastasia*: refer, master
* *Poteshkin Vitaly*: refer, master
* *Tussupova Madina*: refer, master
* *Marinov Andrey*: refer, master
* *Gyamerah Seth*: refer - timeslot, master