I'm interested in data science, machine learning, and deep learning, with a focus on Natural Language Processing (NLP). My research centers on Large Language Models.
We developed an agentic, stage-based LLM framework that guides conversations through five counseling stages, drawing from Person-Centered Therapy and Acceptance and Commitment Therapy. The system uses three specialized agent types: a stage agent for each framework stage, an approach-selection agent that chooses appropriate counseling approaches, and a monitoring agent that manages stage transitions. The framework achieved a 79% positive user reaction rate, significantly outperforming baselines. Real-user testing and evaluation by counseling practitioners confirmed improvements on seven of eight mental health support metrics, demonstrating the potential for scalable LLM-based mental health support in Thailand.
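The three-agent control loop can be sketched as follows. This is a minimal illustration, not the deployed system: the stage names, the turn-count transition heuristic, and the keyword-based approach selection are all hypothetical placeholders standing in for LLM-driven agents.

```python
# Hedged sketch of the three-agent loop; stage names, the transition rule,
# and approach selection are illustrative placeholders, not the real system.
STAGES = ["stage_1", "stage_2", "stage_3", "stage_4", "stage_5"]

class MonitoringAgent:
    """Decides when the conversation should advance to the next stage."""
    def should_advance(self, stage, history):
        # placeholder heuristic: advance after 3 user turns in the same stage
        return sum(1 for turn in history if turn["stage"] == stage) >= 3

class ApproachSelectionAgent:
    """Picks a counseling approach (PCT- or ACT-informed) for the turn."""
    def select(self, stage, user_message):
        # placeholder rule; the real agent reasons over the conversation
        return "ACT" if "avoid" in user_message.lower() else "PCT"

def respond(stage_idx, history, user_message,
            monitor=MonitoringAgent(), selector=ApproachSelectionAgent()):
    stage = STAGES[stage_idx]
    history.append({"stage": stage, "text": user_message})
    approach = selector.select(stage, user_message)
    if monitor.should_advance(stage, history) and stage_idx < len(STAGES) - 1:
        stage_idx += 1  # hand off to the next stage agent
    # the stage agent would build an LLM prompt from (stage, approach) here
    return stage_idx, approach
```

The split mirrors the framework's division of labor: the monitoring agent owns transitions, the selection agent owns the counseling approach, and each stage agent only has to handle its own stage.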
This publication applies MATLAB's neural network functionality and data-science methods to develop an artificial neural network model for predicting hydrogen sulfide solubility in natural gas purification processes. The model achieved a coefficient of determination (R²) of 0.9817 and a mean squared error (MSE) of 0.0014.
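For reference, the two reported metrics are standard regression measures and can be computed as below. This is a minimal pure-Python sketch of the definitions, not the original MATLAB code.

```python
def mse(y_true, y_pred):
    # mean squared error: average squared deviation of predictions from targets
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    # coefficient of determination: 1 - SS_res / SS_tot
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

An R² near 1 and an MSE near 0, as reported, together indicate the network explains almost all of the variance in the solubility data.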
An LLM-based mental health support chatbot in Thai. Dmind Chatbot is currently under continued research and development at the Center of Excellence in Digital and AI for Mental Health (AIMET).
Radchaneeporn Changpun, Naphat Khoprasertthaworn, Pipat Jongpipatchai, Theerin Petcharat, Krittapas Rungsimontuchat
Designed the initial architecture for the chatbot framework
Researched and designed a multi-turn LLM chatbot workflow incorporating mental health support knowledge
Coordinated with domain experts to design and refine the chatbot to meet domain practices
Conducted experiments to identify optimal techniques for the chatbot
Led the research and development teams for the first phase of the project
Led interns in designing and deploying the chatbot for the first phase of the project
Classified Scopus publications using a transformer-encoder language model (RoBERTa), achieving a significant improvement of 40.3% in Macro F1 score (0.6687) over the baseline model (0.1894) and demonstrating the effectiveness of transfer learning for text classification
Implemented data preprocessing techniques, including tokenization, encoding, and data splitting, to prepare the dataset for training and evaluation
Designed and developed a custom RoBERTa-based neural network architecture for multi-label classification, incorporating dropout regularization and a linear classification layer
Utilized PyTorch and Hugging Face libraries to efficiently train and evaluate the model, leveraging GPU acceleration for improved performance
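The custom architecture described above can be sketched roughly as follows. The encoder is passed in (e.g. a Hugging Face `RobertaModel`); the hidden size, dropout rate, and label count shown here are illustrative assumptions, not the project's actual hyperparameters.

```python
import torch
import torch.nn as nn

class RobertaMultiLabelHead(nn.Module):
    """Sketch of the custom head: dropout plus a linear classification layer
    over the encoder's first-token representation."""
    def __init__(self, encoder, hidden_size, num_labels, dropout=0.1):
        super().__init__()
        self.encoder = encoder          # e.g. transformers.RobertaModel
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]          # <s>-token representation
        return self.classifier(self.dropout(pooled))  # one logit per label
```

For multi-label training the logits would typically be paired with `nn.BCEWithLogitsLoss`, so each label receives an independent sigmoid probability rather than competing in a softmax.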
This group project conducted an empirical study comparing LLM techniques (agentic RAG, naive RAG, long-context LLM, and vanilla inference) for answering Thai Personal Income Tax (PIT) questions, measuring performance with automatic NLP metrics (BERTScore, BLEU, ROUGE-L), LLM-as-a-judge, and qualitative analysis
Preprocessed the data (personal income tax knowledge from the Revenue Department of Thailand: https://www.rd.go.th/62337.html) using OCR and web scraping
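Of the automatic metrics listed, ROUGE-L is the simplest to state exactly: it is an F-score over the longest common subsequence (LCS) of candidate and reference tokens. A minimal sketch, assuming whitespace tokenization (the study would have used standard metric implementations):

```python
def lcs_len(a, b):
    # classic dynamic-programming longest common subsequence length
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def rouge_l_f1(candidate, reference):
    # ROUGE-L F1: harmonic mean of LCS-based precision and recall
    c, r = candidate.split(), reference.split()
    lcs = lcs_len(c, r)
    if lcs == 0:
        return 0.0
    precision, recall = lcs / len(c), lcs / len(r)
    return 2 * precision * recall / (precision + recall)
```

Because LCS preserves word order without requiring contiguity, ROUGE-L rewards answers that follow the reference's sentence structure, complementing the n-gram overlap of BLEU and the embedding similarity of BERTScore.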
Developed a RAG technique to reduce hallucination in Llama-2-13B using a vector database built from Scopus publications
Preprocessed the dataset for semantic indexing and used an embedding model to generate semantic embeddings
Integrated Pinecone, a vector database, to store and efficiently retrieve relevant context
Employed Meta's Llama-2-13b-chat-hf as the backbone for generating responses, and used the Llama 2 tokenizer to preprocess and tokenize user queries and context for input to the LLM
Utilized the Langchain framework to streamline the integration of the LLM, vector database, and embedding model
Developed a RAG pipeline that combines the LLM with the retrieved context, enabling it to generate accurate, coherent, and contextually relevant responses to user queries
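The pipeline above can be sketched end-to-end with lightweight stand-ins: a toy character-frequency embedding replaces the real embedding model, an in-memory list with cosine-similarity search replaces Pinecone, and prompt construction stands in for the Llama-2 generation step (orchestrated via LangChain in the real system).

```python
import math

def embed(text):
    # toy character-frequency embedding; the real system used a semantic
    # embedding model over Scopus publication text
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isascii() and ch.isalpha():
            vec[ord(ch) - 97] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def retrieve(query, index, k=3):
    # cosine-similarity top-k search (Pinecone's role in the real pipeline)
    q = embed(query)
    scored = sorted(index, key=lambda d: -sum(a * b for a, b in zip(q, d["vec"])))
    return [d["text"] for d in scored[:k]]

def rag_prompt(query, index):
    # splice retrieved context into the prompt fed to the LLM
    # (Llama-2-13b-chat-hf in the real system)
    context = "\n".join(retrieve(query, index))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Grounding the prompt in retrieved passages is what curbs hallucination: the model is asked to answer from supplied context rather than from its parametric memory alone.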
My Blog
CS Graduate Degree Reflection
My blog about my experience and lessons learned during my computer science graduate degree at Chulalongkorn University, Bangkok, Thailand.
Read My Blog