Hi, I'm Vika,
an AI/NLP researcher and former researcher in High Energy Physics. I completed my PhD, "Computationally Efficient Natural Language Processing Methods using Tensor Representations", at the Skolkovo Institute of Science and Technology under the supervision of Alexander Panchenko.
Research
My current research interests fall into three directions:
- Comparative Question Answering: building a question-answering system for the domain of comparative questions. Specifically, we investigate retrieval-based pipelines that argumentatively support users in deciding between two similar options.
- Efficient Transformers: modern Transformer architectures (GPT-2/3/4) are overparameterized, heavy to deploy, and costly to run. In this area we research compressing models without a quality drop and explore efficient task-oriented fine-tuning of large language models. So far, we have proposed tensor-based TTM (Tensor-Train Matrix) layers that match fully connected (FC) layers in functionality but contain fewer parameters.
- Efficient Knowledge Embedding: by modifying the Canonical Polyadic Decomposition algorithm, we compute the gradients of a knowledge graph embedding model analytically. This reduces the memory needed to process large knowledge graphs by up to 2x.
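
The TTM idea above can be illustrated with a minimal sketch: a dense weight matrix is stored as two small tensor-train cores, which cuts the parameter count while still acting as a linear layer. All shapes, ranks, and names here are illustrative assumptions, not the actual published architecture.

```python
import numpy as np

# Hypothetical sketch of a Tensor-Train Matrix (TTM) layer: a weight matrix
# of shape (in_dim, out_dim) = (I1*I2, J1*J2) is stored as two small TT
# cores instead of one dense matrix. Shapes and rank are illustrative.

rng = np.random.default_rng(0)

I1, I2 = 16, 16      # input dim factorization:  in_dim  = 16*16 = 256
J1, J2 = 16, 16      # output dim factorization: out_dim = 16*16 = 256
r = 4                # TT rank (controls the compression/quality trade-off)

# TT cores: G1 has shape (I1, J1, r), G2 has shape (r, I2, J2)
G1 = rng.normal(size=(I1, J1, r))
G2 = rng.normal(size=(r, I2, J2))

def ttm_forward(x):
    """Apply the TTM layer to a batch of inputs x of shape (batch, I1*I2)."""
    batch = x.shape[0]
    x = x.reshape(batch, I1, I2)
    # contract input with the first core: (b,i1,i2),(i1,j1,r) -> (b,i2,j1,r)
    t = np.einsum('bij,iok->bjok', x, G1)
    # contract with the second core:      (b,i2,j1,r),(r,i2,j2) -> (b,j1,j2)
    y = np.einsum('bjok,kjp->bop', t, G2)
    return y.reshape(batch, J1 * J2)

dense_params = (I1 * I2) * (J1 * J2)   # full FC weight: 65536 parameters
ttm_params = G1.size + G2.size         # two small cores: 2048 parameters
print(dense_params, ttm_params)        # → 65536 2048

# Sanity check: the cores define a full matrix W, and ttm_forward(x) == x @ W
W = np.einsum('iok,kjp->ijop', G1, G2).reshape(I1 * I2, J1 * J2)
x = rng.normal(size=(3, I1 * I2))
assert np.allclose(ttm_forward(x), x @ W)
```

Here the 256×256 dense layer needs 65,536 parameters, while the rank-4 TTM version needs only 2,048; raising the TT rank trades parameters back for expressiveness.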
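
For the knowledge-embedding direction, the flavor of "analytic gradients" can be shown on the simplest CP-style trilinear score, where the gradient with respect to each embedding has a closed form and no autograd graph needs to be stored. This is a toy sketch under those assumptions, not the modified algorithm itself.

```python
import numpy as np

# Hypothetical sketch: CP (Canonical Polyadic) scoring for a knowledge-graph
# triple (head, relation, tail), with gradients of the score written out
# analytically instead of via automatic differentiation.

rng = np.random.default_rng(1)
d = 8                                     # embedding dimension (illustrative)
h, r, t = (rng.normal(size=d) for _ in range(3))

def cp_score(h, r, t):
    # CP trilinear score: sum_k h_k * r_k * t_k
    return float(np.sum(h * r * t))

# Analytic gradients of the score w.r.t. each embedding:
#   d(score)/dh = r * t,   d(score)/dr = h * t,   d(score)/dt = h * r
grad_h, grad_r, grad_t = r * t, h * t, h * r

# Finite-difference check that the analytic gradient is correct
eps = 1e-6
num_grad_h = np.array([
    (cp_score(h + eps * np.eye(d)[k], r, t)
     - cp_score(h - eps * np.eye(d)[k], r, t)) / (2 * eps)
    for k in range(d)
])
assert np.allclose(grad_h, num_grad_h, atol=1e-5)
```

Because each gradient is just an elementwise product of the other two embeddings, no intermediate activations have to be cached for a backward pass, which is the kind of saving behind the up-to-2x memory reduction claimed above.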