Tasmay Malla’s Post
More Relevant Posts
-
#TASK-3# 🚀 Excited to Share My Latest Project! 🚀
I'm thrilled to present my recent work on the Titanic Survival Prediction task as part of the Encryptix Data Science Internship. In this video, I dive into the data, explore the features, and build a predictive model to determine the likelihood of survival for passengers aboard the Titanic. This project has been an incredible learning experience, and I'm excited to share my insights and findings with you all.
🔍 Key Highlights:
- Data cleaning and preprocessing
- Feature engineering and selection
- Model building and evaluation
- Key insights and takeaways
Check out the video to see the process and results! 📽️
#DataScience #MachineLearning #TitanicPrediction #Encryptix #PredictiveModeling #DataAnalysis #Python #LinkedInLearning
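For readers curious what the data cleaning and preprocessing step in a project like this typically involves, here is a minimal Python sketch, assuming the standard Kaggle Titanic train.csv with its usual columns; the exact steps used in the video may differ.

import pandas as pd

# Load the standard Kaggle Titanic training file (assumed filename).
df = pd.read_csv("train.csv")

# Fill missing values: median age, most frequent embarkation port.
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Embarked"] = df["Embarked"].fillna(df["Embarked"].mode()[0])

# Drop columns that are identifiers or too sparse to use directly.
df = df.drop(columns=["Cabin", "Ticket", "Name", "PassengerId"])

# Encode categorical features as numbers.
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df = pd.get_dummies(df, columns=["Embarked"], drop_first=True)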
-
Hello Connections! In Task 1 of my Data Science internship at CodeAlpha, I worked on Titanic survival classification. The goal was to analyze the historical data of Titanic passengers and develop a model to predict whether a passenger survived or not based on various features like age, class, and gender. Utilizing machine learning algorithms and techniques, I gained insights into the factors influencing survival rates. The project provided a practical understanding of data preprocessing, feature engineering, and model evaluation—a crucial step in enhancing my skills as a data scientist. #DataScience #TitanicClassification #CodeAlpha 📈💻
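One common way to approach the feature engineering mentioned above is to derive new columns from the raw ones. Below is a minimal Python sketch, assuming the standard Kaggle Titanic column names (SibSp, Parch, Name); the features actually used in the project may differ.

import pandas as pd

df = pd.read_csv("train.csv")

# Family size: the passenger plus siblings/spouses and parents/children aboard.
df["FamilySize"] = df["SibSp"] + df["Parch"] + 1
df["IsAlone"] = (df["FamilySize"] == 1).astype(int)

# Social title extracted from the name, e.g. "Braund, Mr. Owen Harris" -> "Mr".
df["Title"] = df["Name"].str.extract(r",\s*([^.]+)\.", expand=False).str.strip()
rare_titles = ["Dr", "Rev", "Col", "Major", "Capt", "Sir", "Lady", "Don", "Jonkheer"]
df["Title"] = df["Title"].replace(rare_titles, "Rare")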
-
Pursuing Data Science at New Arts Commerce & Science College, Ahmednagar | Python | SQL | Advanced Excel | MongoDB | Artificial Intelligence | Industry 4.0 | Machine Learning
🚢 Titanic Survival Prediction: A Classic Data Science Project 🚢
I'm excited to share my first project during my internship at #CodSoft! I built a model to predict whether a passenger on the Titanic survived or not using the famous Titanic dataset. This project is a great introduction to data science and machine learning.
🔍 Project Highlights:
1. Data Collection: Utilized the Titanic dataset, which includes details such as age, gender, ticket class, fare, cabin, and survival status of each passenger.
2. Data Cleaning and Preprocessing: Cleaned and processed the data using Python libraries like Pandas and NumPy to ensure accuracy.
3. Exploratory Data Analysis (EDA): Conducted EDA with Seaborn and Matplotlib to uncover patterns and correlations within the data.
4. Model Building: Implemented various machine learning models to predict survival outcomes and evaluated their performance.
This project not only enhanced my technical skills but also provided valuable insights into the factors affecting survival rates on the Titanic.
Check out my work on GitHub: https://lnkd.in/dgvChSzg
#DataScience #MachineLearning #Python #Titanic #DataAnalysis #CodSoft #Internship #Tech #LearningJourney
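For anyone who wants a feel for the EDA step described above, here is a minimal Seaborn/Matplotlib sketch, assuming the standard Kaggle train.csv; the plots in the linked repository may be different.

import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("train.csv")

# Survival rate broken down by ticket class and sex.
sns.barplot(data=df, x="Pclass", y="Survived", hue="Sex")
plt.title("Survival rate by class and sex")
plt.ylabel("Fraction survived")
plt.show()

# Age distribution for survivors vs. non-survivors.
sns.histplot(data=df, x="Age", hue="Survived", bins=30, kde=True)
plt.title("Age distribution by survival")
plt.show()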
-
🚢 Excited to Share My First Task Completion in the CodSoft Internship: Titanic Survival Prediction! 🚢
I've just completed a project where I used the Titanic dataset to build a predictive model that determines whether a passenger on the Titanic survived. This is a classic project that provides a great opportunity to explore data science techniques with a well-known dataset.
🔍 Project Details:
- Dataset Features: Age, gender, ticket class, fare, cabin, and survival status
- IDE: Google Colab
- Tools & Libraries: NumPy, Pandas, Matplotlib, Seaborn, Scikit-Learn
- Model Used: Logistic Regression
- Accuracy: 78.21%
📊 Key Learnings:
- Data preprocessing and visualization
- Applying machine learning algorithms
- Evaluating model performance
Check out my video below to see the code in action! Looking forward to hearing your thoughts and feedback!
#DataScience #MachineLearning #TitanicDataset #Python #AI #PredictiveModeling #codsoft
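The post above names the model and its accuracy; as a rough illustration of how such a logistic regression baseline is usually trained and scored with scikit-learn, here is a minimal sketch, assuming the standard Kaggle train.csv. The notebook in the video may use different features or preprocessing.

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

df = pd.read_csv("train.csv")

# Minimal preprocessing so the example runs end to end.
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})

X = df[["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]]
y = df["Survived"]

# Hold out 20% of the rows for evaluation.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))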
-
🚀 Excited to Share My Recent Project on Image Classification! 🐱🐶
As part of my internship at ProdigyInfo Tech, I recently completed a project involving the classification of cat and dog images using machine learning models. Here's a quick overview of what I achieved:
🔍 Project Summary:
- Objective: Classify images of cats and dogs from the Kaggle dataset.
- Models Used: Support Vector Machine (SVM) and K-Nearest Neighbors (KNN).
🛠 Technical Details:
- Data Preparation: Images were resized to 128x128 pixels and flattened into vectors.
- SVM Model: Kernel: Linear | Class Weight: Balanced | Gamma: Scale | Accuracy: 52.50%
- KNN Model: Number of Neighbors: 5 | Accuracy: 49.64%
📊 Results: The models were tested on new images, and both SVM and KNN produced predictions for cats and dogs.
🎉 Key Takeaways: This project was a fantastic opportunity to apply machine learning techniques to a real-world problem. I gained valuable experience in image preprocessing, model training, and evaluation.
If you're interested in the technical details, feel free to reach out or check out the code snippet below.
#MachineLearning #DataScience #ArtificialIntelligence #Internship #Tech #ImageClassification #SVM #KNN #Python #ProdigyInfoTech
GitHub link: https://lnkd.in/gejkysr7
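For context on what the pipeline described above might look like in code, here is a minimal Python sketch that resizes images to 128x128, flattens them, and fits both an SVM (linear kernel, balanced class weights, scaled gamma) and a 5-neighbor KNN. The folder layout and filenames are assumptions for illustration; the actual project code is in the linked repository.

import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

def load_images(folder, label, size=(128, 128)):
    # Load every JPEG in `folder`, resize it, and flatten it into a 1-D vector.
    X, y = [], []
    for path in Path(folder).glob("*.jpg"):
        img = Image.open(path).convert("RGB").resize(size)
        X.append(np.asarray(img, dtype=np.float32).flatten() / 255.0)
        y.append(label)
    return X, y

# Hypothetical folder layout: one directory per class.
X_cats, y_cats = load_images("data/cats", 0)
X_dogs, y_dogs = load_images("data/dogs", 1)
X = np.array(X_cats + X_dogs)
y = np.array(y_cats + y_dogs)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

svm = SVC(kernel="linear", class_weight="balanced", gamma="scale").fit(X_train, y_train)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)

print("SVM accuracy:", accuracy_score(y_test, svm.predict(X_test)))
print("KNN accuracy:", accuracy_score(y_test, knn.predict(X_test)))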
-
🚀 Excited to Share My Progress! 🚀 I’ve just completed my second task in the TechnoHacks EduTech Official Data Science Internship — "Fraud Transaction Detection." 🕵️♂️💻 🔍 During this task, I worked on developing a machine learning model to detect fraudulent transactions, a critical real-world application of data science. I utilized techniques like data preprocessing, feature engineering, and algorithm tuning to achieve reliable and accurate results. On to the next challenge! 🚀 #DataScience #FraudDetection #MachineLearning #Technohacks #AI #InternshipExperience #Python #Innovation #LearningJourney
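The post above doesn't include code, but a core issue in fraud detection is class imbalance, since fraudulent transactions are rare. A minimal scikit-learn sketch under that assumption, with hypothetical file and column names, might look like this:

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report

# Hypothetical file and column names; real fraud datasets vary.
df = pd.read_csv("transactions.csv")
X = df.drop(columns=["is_fraud"])
y = df["is_fraud"]

# Stratify so the rare fraud class appears in both splits.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

# class_weight="balanced" compensates for how rare fraud cases are.
model = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=42)
model.fit(X_train, y_train)

# Accuracy alone is misleading on imbalanced data; precision and recall on the fraud class matter more.
print(classification_report(y_test, model.predict(X_test)))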
-
Movie Rating Prediction Model 🎬
I am incredibly excited to share the second project I've been passionately working on during my data science internship at Encryptix. 🚀 In this video, you'll see a movie rating prediction model that I've developed, which uses machine learning to predict ratings based on various features. 📊
This project has been a great learning experience. I learned a lot about data preprocessing and different machine learning algorithms, and the hands-on experience was very valuable. Tuning hyperparameters and building visualizations helped me improve both my technical skills and my ability to explain complex data clearly. Exploring different machine learning algorithms was fascinating: I enjoyed comparing their performance and understanding the scenarios in which each one works best. Creating interactive visualizations was both fun and educational.
#DataScience #DataVisualization #DataAnalysis #InternshipJourney #Python
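As a rough illustration of the hyperparameter tuning mentioned above, here is a minimal regression sketch with a small grid search. The file name, columns, and model are assumptions for the example, not the ones used in the actual project.

import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

# Hypothetical file and column names for a movie-rating table.
df = pd.read_csv("movies.csv")
X = pd.get_dummies(df[["genre", "year", "duration", "votes"]], columns=["genre"])
y = df["rating"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Small grid search over two hyperparameters as an example of tuning.
grid = GridSearchCV(
    RandomForestRegressor(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10, 20]},
    cv=5,
)
grid.fit(X_train, y_train)

rmse = mean_squared_error(y_test, grid.best_estimator_.predict(X_test)) ** 0.5
print("Best params:", grid.best_params_)
print("Test RMSE:", rmse)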
-
🚀 Hello #connections!!!! 🚀
I'm thrilled to share that I've just completed Task No. 1 of my Data Science Internship with CodSoft: TITANIC SURVIVAL PREDICTION! 🌊🛳️
In this project, I delved into the historical Titanic dataset to build a machine learning model that predicts whether a passenger on the Titanic survived or not. The dataset includes features such as age, gender, ticket class, and fare, all of which were crucial in training the model. 📊📈
Task 1: Titanic Survival Prediction
IDE: Jupyter Notebook
GitHub: https://lnkd.in/gARCmAw8
This classic project was an exciting opportunity to strengthen my skills in data analysis, feature engineering, and model building. I'm eager to tackle the next challenges and continue exploring the fascinating world of data science! 🔍💻
#DataScience #Internship #CODSOFT #TitanicSurvivalPrediction #MachineLearning #Python #JupyterNotebook #ExplorationInTech
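One tidy way to combine the data analysis, feature handling, and model building steps mentioned above is a scikit-learn Pipeline with a ColumnTransformer. The sketch below is an illustrative pattern, assuming the standard Kaggle train.csv, not the code in the linked repository.

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import OneHotEncoder, StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("train.csv")
X = df[["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare", "Embarked"]]
y = df["Survived"]

numeric = ["Age", "SibSp", "Parch", "Fare"]
categorical = ["Pclass", "Sex", "Embarked"]

# Impute and scale numeric columns; impute and one-hot encode categorical ones.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), categorical),
])

clf = Pipeline([("prep", preprocess), ("model", LogisticRegression(max_iter=1000))])

# 5-fold cross-validated accuracy is a more stable estimate than a single split.
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())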
-
Aspiring Data Scientist | SRM University Student | Machine Learning & Data Analytics Enthusiast | SQL | Power BI
#TASK 1 - TITANIC SURVIVAL PREDICTION
Hello LinkedIn Community, I'm excited to share a new project that I've been working on: an in-depth exploration of the Titanic dataset through a dynamic data model! 🎥🔍 This is my first task during the internship, which I received from #Encryptix.
In this video, I walk you through:
🔹 Data Overview: An introduction to the Titanic dataset and its key features.
🔹 Model Building: A step-by-step process of creating a predictive model, including data preprocessing, feature engineering, and model selection.
🔹 Results & Insights: Analysis of model performance and the insights derived from the data.
Libraries used:
° Pandas
° Seaborn
° NumPy
° Matplotlib
° Sklearn
#DataScience #MachineLearning #TitanicDataset #PredictiveModeling #DataAnalysis #LinkedInLearning
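Since the walkthrough covers model selection, here is a minimal sketch of how several candidate classifiers can be compared with cross-validation. The library names match the ones listed above, but the specific models and features are assumptions for illustration.

import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv("train.csv")
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
X = df[["Pclass", "Sex", "Age", "Fare"]]
y = df["Survived"]

# Compare a few standard classifiers on the same 5-fold split.
models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Decision Tree": DecisionTreeClassifier(random_state=42),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")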
-
I'm excited to share that I've been working on a project to implement a support vector machine (SVM) to classify images of cats and dogs from the Kaggle dataset.
Project Steps:
- Data Analysis: Explore and understand patterns within the Kaggle dataset containing cat and dog images.
- Data Preprocessing: Perform essential preprocessing tasks, including image resizing, normalization, and dataset splitting for training and testing.
- Feature Extraction: Extract relevant features from the images to serve as input for the SVM model.
- Model Implementation: Design and train an SVM model specifically for image classification, utilizing the selected features.
- Model Evaluation: Evaluate the model's performance on separate test data, assessing key metrics such as accuracy, precision, recall, and F1 score.
- Fine-Tuning: Explore hyperparameter tuning to optimize the SVM model for better classification results.
Key Takeaways:
- Image Classification: Gained practical experience in creating image classification models using machine learning techniques.
- Support Vector Machine (SVM): Understood how SVM efficiently performs binary classification tasks.
- Data Preprocessing: Enhanced skills in preparing image data for machine learning, including resizing and normalization.
- Model Evaluation: Learned to interpret evaluation metrics for image classification tasks.
- Kaggle Dataset: Gained insights into real-world data challenges and best practices when constructing datasets.
This project was completed during a machine learning internship at Prodigy Infotech. The source code is available on GitHub, and I've also posted a screen recording of the walkthrough: https://lnkd.in/dMVZtEYP
Prodigy InfoTech #Prodigyinfotech #Machinelearning #SVM
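As a pointer to what the Fine-Tuning and Model Evaluation steps above often look like in scikit-learn, here is a minimal sketch: a small grid search over the SVM's C and kernel, followed by a per-class precision/recall/F1 report. The placeholder feature matrix stands in for flattened image features and is only there so the example runs; the real project uses the extracted image features from the repository.

import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

# Placeholder features standing in for flattened/extracted image vectors (0 = cat, 1 = dog).
rng = np.random.default_rng(0)
X = rng.random((200, 256))
y = rng.integers(0, 2, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Small grid over the regularization strength C and the kernel choice.
grid = GridSearchCV(SVC(), param_grid={"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}, cv=3)
grid.fit(X_train, y_train)

# Precision, recall, and F1 for each class, plus overall accuracy.
print("Best params:", grid.best_params_)
print(classification_report(y_test, grid.best_estimator_.predict(X_test)))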
Wishing you the best