Fawazzx committed on
Commit cc964a8
1 Parent(s): a8045d9
Files changed (1)
  1. README +71 -0
README ADDED
@@ -0,0 +1,71 @@
# Fine-Tuning ResNet50 for Alzheimer's MRI Classification

This repository contains a Jupyter Notebook for fine-tuning a ResNet50 model to classify Alzheimer's disease stages from MRI images. The notebook uses PyTorch, and the dataset is loaded from the Hugging Face Datasets library.

5
+ ## Table of Contents
6
+ - [Introduction](#introduction)
7
+ - [Dataset](#dataset)
8
+ - [Model Architecture](#model-architecture)
9
+ - [Setup](#setup)
10
+ - [Training](#training)
11
+ - [Evaluation](#evaluation)
12
+ - [Usage](#usage)
13
+ - [Results](#results)
14
+ - [Contributing](#contributing)
15
+ - [License](#license)
16
+
## Introduction
This notebook fine-tunes a pre-trained ResNet50 model to classify MRI images into one of four stages of Alzheimer's disease:
- Mild Demented
- Moderate Demented
- Non-Demented
- Very Mild Demented

## Dataset
The dataset used is [Falah/Alzheimer_MRI](https://huggingface.co/datasets/Falah/Alzheimer_MRI) from the Hugging Face Datasets library. It consists of MRI images categorized into the four stages of Alzheimer's disease.

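A minimal sketch of loading the dataset with the Hugging Face Datasets library is shown below; the split and column names are assumptions and should be checked against the dataset card:

```python
# Minimal loading sketch; the split and column names are assumptions.
from datasets import load_dataset

# Download the Alzheimer MRI dataset from the Hugging Face Hub.
dataset = load_dataset("Falah/Alzheimer_MRI")

# Inspect the splits and the label feature (assuming a "train" split exists).
print(dataset)
print(dataset["train"].features)
```
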
## Model Architecture
The model architecture is based on ResNet50. The final fully connected layer is modified to output predictions for 4 classes.

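As an illustration (not necessarily the exact code in the notebook), swapping the classification head of a torchvision ResNet50 for a 4-class output can look like this, assuming torchvision >= 0.13 for the `weights` argument:

```python
# Sketch of the head replacement; the notebook may load pretrained weights differently.
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet50 backbone.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

# Replace the final fully connected layer with a 4-class output head.
model.fc = nn.Linear(model.fc.in_features, 4)
```
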
## Setup
To run the notebook locally, follow these steps:

1. Clone the repository:
   ```bash
   git clone https://github.com/your_username/alzheimer_mri_classification.git
   cd alzheimer_mri_classification
   ```

2. Install the required dependencies (a sample requirements file is sketched after these steps):
   ```bash
   pip install -r requirements.txt
   ```

3. Open the notebook:
   ```bash
   jupyter notebook fine-tuning.ipynb
   ```

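The exact contents of `requirements.txt` are not shown in this snapshot; a plausible minimal version, assuming the notebook only needs PyTorch, torchvision, the Hugging Face Datasets library, and Jupyter, could look like:

```text
# Hypothetical requirements.txt; pin versions to match your environment.
torch
torchvision
datasets
jupyter
```
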
## Training
The notebook includes sections for:
- Loading and preprocessing the dataset (a sketch follows this list)
- Defining the model architecture
- Setting up the training loop with a learning rate scheduler and optimizer
- Training the model for a specified number of epochs
- Saving the trained model weights

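A hedged sketch of the preprocessing step is shown below; the resize dimensions, normalization statistics, batch size, and the `collate_fn` helper are illustrative assumptions, not the notebook's exact settings.

```python
# Illustrative preprocessing sketch; the image size, normalization statistics,
# and batch size are assumptions rather than the notebook's exact settings.
import torch
from torch.utils.data import DataLoader
from torchvision import transforms

# ResNet50 expects 3-channel 224x224 inputs; ImageNet statistics are a common default.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # MRI slices are often single-channel
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def collate_fn(batch):
    # Each item is a dict from the Hugging Face dataset; the "image" and "label"
    # column names are assumed and should be checked against the dataset card.
    images = torch.stack([transform(item["image"]) for item in batch])
    labels = torch.tensor([item["label"] for item in batch])
    return images, labels

train_loader = DataLoader(dataset["train"], batch_size=32, shuffle=True, collate_fn=collate_fn)
```
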
### Example Training Code
```python
# Training loop example; model, train_loader, criterion, optimizer, device, and
# num_epochs are assumed to be defined earlier in the notebook.
for epoch in range(num_epochs):
    model.train()
    running_loss = 0.0
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()              # reset gradients from the previous batch
        outputs = model(images)            # forward pass
        loss = criterion(outputs, labels)
        loss.backward()                    # backpropagation
        optimizer.step()                   # update the weights
        running_loss += loss.item()
    # Report the average loss over the epoch.
    print(f"Epoch [{epoch+1}/{num_epochs}], Loss: {running_loss/len(train_loader):.4f}")
```
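The last bullet under Training mentions saving the trained weights; a minimal, hedged example (the filename is a placeholder):

```python
# Persist the fine-tuned weights; the filename is a placeholder, not the notebook's actual path.
torch.save(model.state_dict(), "resnet50_alzheimer_mri.pt")
```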