I am Swapnil Patel

Python Developer, Data Scientist, Backend Developer, Machine Learning, Deep Learning

Name: Swapnil Patel

Profile: Python developer

Email: patelswapnil2308@gmail.com

Phone: 9313398797

Skills

Python 90%
Machine Learning 85%
Deep Learning 90%
Data Science 80%
JavaScript 75%
C 50%
Java 40%
About me

I am a dedicated and enthusiastic Computer Science student with a strong passion for technology and problem-solving. With a solid foundation in programming and a keen interest in emerging technologies, I strive to make a positive impact through my work.

Throughout my academic journey, I have gained a comprehensive understanding of various programming languages, algorithms, and data structures. However, my true passion lies in the field of data science. I have developed a deep understanding of Python and its ecosystem, including libraries such as NumPy, Pandas, and Matplotlib.

Outside of my academic pursuits, I actively engage in personal projects and participate in programming competitions. These experiences have sharpened my problem-solving abilities and taught me the importance of perseverance and attention to detail. My goal is to combine my technical expertise with my passion for data science and AI to make a meaningful impact in the industry.

Certification

My certifications in different areas.

Code In Python 3


Certificate ↗
IBM Data Science Course

Completed the IBM Data Science course online.

Certificate ↗
Data Science Internship

Completed a 4-week Data Science internship at SKILLVOID, IIT Roorkee.

Certificate ↗

Projects

Here are my best projects.

Publications

My Research Paper

A Comparison Between Custom Activation Function With Existing Activation Function

Our study demonstrates that incorporating custom activation functions in CNN architectures leads to improved accuracy compared to existing activation functions. The custom activation functions outperformed traditional ones such as ReLU, tanh, and ELU on the MNIST digits, Fashion-MNIST, and CIFAR-10 datasets. The results highlight the potential of custom activation functions to enhance model performance and underscore the importance of selecting an activation function appropriate to the dataset and model architecture. Further research in this area can provide valuable insights for improving deep learning models.

Paper ↗
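To illustrate the idea behind the paper, here is a minimal sketch of a custom activation function compared against ReLU on a few inputs. The Swish-style function below (with its `beta` parameter) is a hypothetical stand-in for illustration only; the paper's actual custom activation functions are not reproduced here.

```python
import math

def relu(x):
    """Standard ReLU: max(0, x)."""
    return max(0.0, x)

def custom_activation(x, beta=1.0):
    """A hypothetical custom activation for illustration
    (Swish-style: x * sigmoid(beta * x)). Smooth and non-monotonic
    near zero, unlike ReLU."""
    return x * (1.0 / (1.0 + math.exp(-beta * x)))

# Compare the two on a few sample inputs
for x in (-2.0, -0.5, 0.0, 0.5, 2.0):
    print(f"x={x:+.1f}  relu={relu(x):.4f}  custom={custom_activation(x):.4f}")
```

In a real experiment, such a function would replace the activation layers of a CNN and the two variants would be trained and evaluated on the same dataset to compare accuracy.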