Titanic Dataset - Building a Neural Network with PyTorch + Testing for Overfitting
Reference
A Detailed Explanation of the Titanic Dataset Structure
Introduction
Recently, I enrolled in an AI course that included an assignment to build a neural network and train it on the Titanic dataset. The task was to deliberately induce overfitting by adding hidden layers and neurons, and then to mitigate it using dropout or other methods.
This article documents the process of completing the assignment.
Environment Setup and Assignment Requirements
Environment Setup:
Python 3. ...
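As a rough illustration of the kind of model the assignment asks for, here is a minimal PyTorch sketch with dropout. The feature count, layer sizes, and dropout rate are placeholders chosen for the example, not the values used in the article.

    import torch
    import torch.nn as nn

    class TitanicNet(nn.Module):
        """Small fully connected classifier; dropout is one way to curb overfitting."""
        def __init__(self, n_features: int = 8, dropout: float = 0.5):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features, 64),
                nn.ReLU(),
                nn.Dropout(dropout),   # randomly zeroes activations during training
                nn.Linear(64, 32),
                nn.ReLU(),
                nn.Dropout(dropout),
                nn.Linear(32, 2),      # survived / did not survive
            )

        def forward(self, x):
            return self.net(x)

    model = TitanicNet(n_features=8)
    x = torch.randn(4, 8)              # dummy batch of 4 preprocessed passengers
    print(model(x).shape)              # torch.Size([4, 2])

Removing the Dropout layers (or widening the network further) gives the overfitting variant to compare against.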
CIFAR10 Dataset - Using PyTorch to Build a CNN + Activate the GPU + Output the Results to TensorBoard
Introduction
I recently enrolled in an AI course, and this is the third assignment. It mainly draws on the following resources:
How to build a CNN with PyTorch: PyTorch Tutorial
How to use TensorBoard with PyTorch: PyTorch TensorBoard Tutorial
How to use TensorBoard in Colab: TensorBoard in Colab Tutorial
The main purpose of this article is to understand CNNs, build a deeper network, use the GPU to improve training efficiency, and finally display the loss and mis ...
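To make the moving parts concrete, here is a hedged sketch of moving a model to the GPU and logging the training loss to TensorBoard. The two-convolution network and the run name are illustrative, not the deeper architecture built in the article.

    import torch
    import torch.nn as nn
    from torch.utils.tensorboard import SummaryWriter

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Illustrative small CNN for 3x32x32 CIFAR10 images (not the article's exact network).
    model = nn.Sequential(
        nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(64 * 8 * 8, 10),
    ).to(device)

    writer = SummaryWriter("runs/cifar10_demo")
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

    # One dummy training step to show the logging pattern.
    images = torch.randn(16, 3, 32, 32, device=device)
    labels = torch.randint(0, 10, (16,), device=device)
    loss = criterion(model(images), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    writer.add_scalar("Loss/train", loss.item(), global_step=0)
    writer.close()

Running tensorboard --logdir runs then displays the logged curve.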
Twitter Dataset - Using an LSTM to Predict the Emotion of a Tweet
Introduction
Recently, I took an AI course, and this is the sixth assignment. The main topics covered are:
Learn to use LSTM
Use SpaCy
Homework Requirements
Train a text classification model on the TweetEval emotion recognition dataset using LSTMs and GRUs (a minimal model sketch follows this list).
Build an LSTM model: Follow the example described here. Use the same architecture, but:
only use the last output of the LSTM in the loss function
use an embedding dim of 128
use a hidden dim of 256.
Use SpaCy to spli ...
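Here is a minimal sketch of an LSTM classifier that matches the stated dimensions (embedding dim 128, hidden dim 256) and uses only the last LSTM output for classification. The vocabulary size and the four emotion classes are assumptions about the preprocessing, not values taken from the assignment text.

    import torch
    import torch.nn as nn

    class LSTMClassifier(nn.Module):
        def __init__(self, vocab_size: int = 10000, num_classes: int = 4,
                     embedding_dim: int = 128, hidden_dim: int = 256):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, embedding_dim)
            self.lstm = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
            self.fc = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            embedded = self.embedding(token_ids)   # (batch, seq_len, 128)
            output, _ = self.lstm(embedded)        # (batch, seq_len, 256)
            last_output = output[:, -1, :]         # keep only the last time step
            return self.fc(last_output)            # (batch, num_classes)

    model = LSTMClassifier()
    dummy_batch = torch.randint(0, 10000, (8, 40))  # 8 tweets, 40 token ids each
    print(model(dummy_batch).shape)                 # torch.Size([8, 4])

Swapping nn.LSTM for nn.GRU, with the same last-step slicing, gives the GRU variant the assignment also asks for.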
COCO Dataset - Using Faster R-CNN + MobileNet for Object Detection
Introduction
Recently, I took an AI course. The main topics covered are:
Learn about the COCO dataset
Use a pre-trained Faster R-CNN to predict bounding boxes (see the sketch after the requirements below)
Calculate IoU
Homework Requirements
Download the COCO collection: download the files “2017 Val images [5K/1GB]” and “2017 Train/Val annotations [241MB]” from the COCO page. You can load them into your notebook using the pycocotools library.
Randomly select ten images from the dataset: 10 images are r ...
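As a sketch of the two technical pieces, the following loads torchvision's pre-trained Faster R-CNN with a MobileNetV3 backbone and computes IoU for two axis-aligned boxes. The weights= argument assumes torchvision 0.13 or newer, and the example image and boxes are made up rather than taken from the dataset.

    import torch
    import torchvision

    # Pre-trained Faster R-CNN with a MobileNetV3-Large FPN backbone.
    model = torchvision.models.detection.fasterrcnn_mobilenet_v3_large_fpn(weights="DEFAULT")
    model.eval()

    image = torch.rand(3, 480, 640)        # stand-in for a real COCO image in [0, 1]
    with torch.no_grad():
        prediction = model([image])[0]     # dict with 'boxes', 'labels', 'scores'
    print(prediction["boxes"].shape)

    def iou(box_a, box_b):
        """Intersection over Union for two [x1, y1, x2, y2] boxes."""
        x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
        x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
        inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        return inter / (area_a + area_b - inter)

    print(iou([0, 0, 100, 100], [50, 50, 150, 150]))  # roughly 0.143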