This study explores applications of fractional calculus in neural networks, modifying activation and loss functions through the order of a fractional derivative. Evaluations on CIFAR-10, CIFAR-100LT, ImageNet, and UTKFace highlight the practical impact of the approach, and its time, computational, and memory complexity are discussed.
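To illustrate the idea of parameterizing an activation by a fractional derivative order, the sketch below applies the Caputo fractional derivative to ReLU: for x > 0 and order α in [0, 1), D^α x = x^(1−α) / Γ(2−α). This is a minimal illustrative example under that assumption; the study's exact formulation of fractional activations may differ.

```python
import math

def fractional_relu(x: float, alpha: float = 0.5) -> float:
    """Caputo fractional derivative of ReLU with order alpha in [0, 1).

    For x > 0:  D^alpha x = x**(1 - alpha) / Gamma(2 - alpha)
    For x <= 0: 0 (as in ordinary ReLU)

    alpha = 0 recovers standard ReLU; as alpha -> 1 the positive
    branch flattens toward a step-like response.
    """
    if x <= 0.0:
        return 0.0
    return x ** (1.0 - alpha) / math.gamma(2.0 - alpha)
```

Varying `alpha` thus interpolates the shape of the nonlinearity, which is the kind of tunable degree of freedom the study evaluates.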