A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.
About me
This is a page not in the main menu.
Published:
This post will show up by default. To disable scheduling of future posts, edit _config.yml and set future: false.
Published:
This is a sample blog post. Lorem ipsum: I can’t remember the rest of lorem ipsum, and I don’t have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.
Published:
This is a sample blog post. Lorem ipsum: I can’t remember the rest of lorem ipsum, and I don’t have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.
Published:
This is a sample blog post. Lorem ipsum: I can’t remember the rest of lorem ipsum, and I don’t have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.
Published:
This is a sample blog post. Lorem ipsum: I can’t remember the rest of lorem ipsum, and I don’t have an internet connection right now. Testing, testing, testing this blog post. Blog posts are cool.
Short description of portfolio item number 1
Short description of portfolio item number 2
Published in N/A, 2019
Developed a novel method to estimate the circumference of an ellipse and π by exactly calculating the perimeter of stretched regular polygons.
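The exact closed-form construction from that work isn't reproduced here. As a rough numerical illustration of the stretched-polygon idea only (the function name, polygon count, and sample semi-axes below are placeholder choices for this sketch, not the published method): inscribe a regular n-gon in the unit circle, stretch its vertices by the semi-axes a and b, and sum the edge lengths. As n grows the total approaches the ellipse's circumference, and the circle case recovers π.

```python
import math

def stretched_polygon_perimeter(a, b, n=100_000):
    """Perimeter of a regular n-gon inscribed in the unit circle and then
    stretched by a along x and b along y. For large n this approaches the
    circumference of the ellipse with semi-axes a and b (and 2*pi when
    a == b == 1). Illustrative numerics only, not the paper's exact method."""
    total = 0.0
    for k in range(n):
        t0 = 2 * math.pi * k / n
        t1 = 2 * math.pi * (k + 1) / n
        x0, y0 = a * math.cos(t0), b * math.sin(t0)
        x1, y1 = a * math.cos(t1), b * math.sin(t1)
        total += math.hypot(x1 - x0, y1 - y0)   # length of one stretched edge
    return total

print(stretched_polygon_perimeter(3, 2))      # ellipse with semi-axes 3 and 2, about 15.87
print(stretched_polygon_perimeter(1, 1) / 2)  # circle case: half the unit circle's perimeter is pi
```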
Published in N/A, 2022
Fine-tuned embeddings and BERT-style models for tasks such as text clustering, sentiment analysis, and named entity recognition on hospital reports.
Published in N/A, 2023
Trained a multi-head attention vision architecture to improve classification performance on FAST (Focused Assessment with Sonography in Trauma) ultrasound exams given limited training data.
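The trained architecture isn't specified in this summary, so the sketch below is only a generic example of what a multi-head-attention image classifier can look like; the patch size, embedding width, head count, grayscale input, and two-class output are all placeholder assumptions rather than the published model.

```python
import torch
import torch.nn as nn

class TinyAttentionClassifier(nn.Module):
    """Minimal patch-embedding + multi-head self-attention classifier.
    Illustrative only: every size and hyperparameter here is a placeholder."""

    def __init__(self, image_size=224, patch_size=16, embed_dim=128,
                 num_heads=4, num_classes=2):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        # Non-overlapping patches -> one embedding vector per patch.
        self.patchify = nn.Conv2d(1, embed_dim, kernel_size=patch_size,
                                  stride=patch_size)
        self.pos = nn.Parameter(torch.zeros(1, num_patches, embed_dim))
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x):                                      # x: (B, 1, H, W)
        tokens = self.patchify(x).flatten(2).transpose(1, 2)   # (B, N, D)
        tokens = tokens + self.pos
        attended, _ = self.attn(tokens, tokens, tokens)        # self-attention over patches
        pooled = self.norm(attended).mean(dim=1)               # mean-pool patch tokens
        return self.head(pooled)                               # (B, num_classes)

# Dummy batch of 8 single-channel images at 224x224.
logits = TinyAttentionClassifier()(torch.randn(8, 1, 224, 224))
print(logits.shape)  # torch.Size([8, 2])
```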
Published in Empirical Methods in Natural Language Processing (EMNLP), 2023
This paper discusses the distillation of long-context, efficient-attention BERT-based models to yield models that are smaller, faster, and cheaper to deploy; a generic sketch of the distillation objective follows the citation below.
Recommended citation: Brown, Nathan and Williamson, Ashton and Anderson, Tahj and Lawrence, Logan. (2023). "Efficient Transformer Knowledge Distillation: A Performance Review." Empirical Methods in Natural Language Processing. https://arxiv.org/pdf/2311.13657.pdf
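The sketch mentioned above is a standard temperature-scaled logit-distillation objective: a KL term pushes the student toward the teacher's softened predictions and is blended with the usual supervised cross-entropy. The temperature, blending weight, and toy tensor shapes are placeholder choices, not the paper's training recipe.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend supervised cross-entropy with a temperature-scaled KL term
    that pushes the student toward the teacher's soft labels.
    The temperature and alpha values are placeholders."""
    # KL(teacher || student) at temperature T, scaled by T^2 so its
    # gradient magnitude stays comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy batch: 4 examples, 3 classes.
student = torch.randn(4, 3, requires_grad=True)
teacher = torch.randn(4, 3)
labels = torch.tensor([0, 2, 1, 0])
print(distillation_loss(student, teacher, labels))
```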
Published in arXiv, 2024
Developed in partnership with the University of Pretoria's Data Science for Social Impact (DSFSI) research group, this paper discusses the development of the Bilingual Open Tswana Suite of Language Models (BOTS-LM), a suite of LLMs trained for Setswana and English.
Recommended citation: Brown, Nathan and Marivate, Vukosi. (2024). "BOTS-LM: Training Large Language Models for Setswana." arXiv. https://arxiv.org/abs/2408.02239
Published:
This is a description of your talk, which is a markdown file that can be all markdown-ified like any other post. Yay markdown!
Published:
This is a description of your conference proceedings talk; note the different value in the type field. You can put anything in this field.
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.