Related projects
Discover more projects across a range of sectors and disciplines — from AI to cleantech to social innovation.
The pre-trained Bidirectional Encoder Representations from Transformers (BERT) model has proven to be a milestone in Natural Language Processing (NLP), achieving new state-of-the-art results on many tasks in the field. Despite this success, there is still considerable room for improvement, both in training efficiency and in structural design. The proposed research project would examine BERT's design decisions at many levels and optimize them wherever possible. The expected result is an improved language model that achieves higher performance on NLP tasks while using fewer computational resources.
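For context on the efficiency motivation, below is a minimal sketch of how one might measure BERT's footprint and compare it against a slimmed-down variant. It assumes the Hugging Face transformers library (with PyTorch installed); the reduced configuration values are hypothetical placeholders for the kind of structural knobs such a project could explore, not the project's actual design choices.

```python
# Minimal sketch (assumes Hugging Face `transformers` + PyTorch).
# Compares the parameter count of standard pre-trained BERT-base
# against a hypothetical smaller configuration.
from transformers import BertConfig, BertModel

# Standard pre-trained checkpoint: 12 layers, hidden size 768, 12 heads.
base = BertModel.from_pretrained("bert-base-uncased")
print(f"BERT-base parameters: "
      f"{sum(p.numel() for p in base.parameters()) / 1e6:.1f}M")  # ~110M

# Hypothetical reduced variant (illustrative values only):
# fewer layers, a narrower hidden size, a smaller feed-forward block.
small_cfg = BertConfig(
    num_hidden_layers=6,
    hidden_size=512,
    num_attention_heads=8,
    intermediate_size=2048,
)
small = BertModel(small_cfg)  # randomly initialized; structure only
print(f"Reduced variant parameters: "
      f"{sum(p.numel() for p in small.parameters()) / 1e6:.1f}M")
```

Parameter count is only a rough proxy for cost; training efficiency also depends on factors such as sequence length, batch size, and the pre-training objective.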
Faculty supervisor: Jimmy Ba
Intern: Xiaoshi Huang
Partner organization: Layer 6 AI
Discipline: Computer science
Program: Accelerate