The FF-influenced Algorithm for the Classification of Large-scale Tasks

Document Type

Presentation

Location

COAS: Math Conference Room

Start Date

April 18, 2025, 11:45 AM

End Date

April 18, 2025, 12:10 PM

Description

Recent advances in machine learning are exploring alternatives to backpropagation, including the forward-forward (FF) algorithm introduced by Geoffrey Hinton. The FF algorithm replaces the backward pass with two forward passes: one on positive (real) data and one on generated negative data, and it was originally demonstrated on several small problems. Building on the FF algorithm, we propose to increase the goodness of positive data and decrease the goodness of negative data to classify images in both the MNIST and CIFAR-10 datasets. We compare the FF-based network with standard backpropagation-based feed-forward neural networks, evaluating classification accuracy and loss. We show that the FF algorithm achieves accuracy comparable to traditional feed-forward neural networks on the CIFAR-10 dataset, suggesting the feasibility of the FF algorithm for large-scale tasks.

This is joint work with Brady Heddon, Adam Kuzminski, Kaitlyn Cavanaugh, Jack Dillard, and Sirani M. Perera.
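For readers unfamiliar with the forward-forward rule described in the abstract, the sketch below illustrates the layer-local training step: goodness is measured as the sum of squared activations, pushed above a threshold for positive data and below it for negative data, with no gradients flowing between layers. This is a minimal PyTorch illustration with assumed layer sizes, threshold, and learning rate; it is not the implementation used in this work.

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One fully connected layer trained with a forward-forward-style rule:
    raise the goodness (sum of squared activations) for positive data and
    lower it for negative data, without backpropagating across layers.
    Hyperparameters here (threshold, learning rate) are illustrative only."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.relu = nn.ReLU()
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.linear.parameters(), lr=lr)

    def forward(self, x):
        # Normalize the input so only its direction is passed forward,
        # keeping this layer's goodness independent of the previous layer's.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-4)
        return self.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)  # goodness of positive data
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)  # goodness of negative data
        # Push positive goodness above the threshold and negative goodness below it.
        loss = torch.log(1 + torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()  # gradients stay local to this single layer
        self.opt.step()
        # Detach activations so no gradient flows to the next layer.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

A full network would stack several such layers, feeding each layer the detached positive and negative activations of the layer below, and classify at inference time by choosing the label whose embedded input yields the highest accumulated goodness.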
