By analyzing the Titanic dataset and employing advanced algorithms, a proficient data analyst developed a robust predictive model to determine passengers' likelihood of survival. This project showcases the power of data-driven insights in historical contexts and underscores the effectiveness of leveraging advanced analytics for practical applications. #DataAnalytics #PredictiveModeling #HistoricalContext #AdvancedAnalytics CodSoft
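The post doesn't share the model itself, so the following is only a minimal sketch of what such a survival classifier could look like in Python, assuming the standard Kaggle Titanic columns (Survived, Pclass, Sex, Age, Fare); the file name and the choice of a random forest are illustrative assumptions, not the author's actual pipeline.

```python
# Minimal sketch of a Titanic survival classifier (assumed columns/path).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("titanic.csv")  # hypothetical path

# Simple feature prep: encode sex as 0/1, fill missing values with medians.
X = pd.DataFrame({
    "pclass": df["Pclass"],
    "is_female": (df["Sex"] == "female").astype(int),
    "age": df["Age"].fillna(df["Age"].median()),
    "fare": df["Fare"].fillna(df["Fare"].median()),
})
y = df["Survived"]

# Estimate out-of-sample accuracy with 5-fold cross-validation.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"5-fold CV accuracy: {scores.mean():.3f}")
```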
This is my analysis of the Titanic and unicorn datasets. I was able to do the following:
1. Connect Power BI to a dataset
2. Visualize data in Power BI
Also, the following questions were answered for the unicorn data (a sketch of the same aggregates follows this list):
1. What is the total valuation?
2. What is the total number of companies?
3. What is the total number of industries?
4. What is the total number of cities?
5. What is the total number of countries?
6. What is the valuation trend by year?
7. Which 10 countries have the most unicorns?
8. What are the 10 most valued industries?
9. What are the 10 most valued companies?
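The dashboard itself was built in Power BI, but for readers who prefer code, here is a hedged pandas sketch of the same aggregates. The file name and column names (Company, Valuation, Industry, City, Country, Date Joined) are assumptions based on common unicorn-company datasets, and Valuation is assumed to be numeric.

```python
# Pandas sketch of the unicorn-data questions (assumed columns/path).
import pandas as pd

uni = pd.read_csv("unicorns.csv")  # hypothetical path

print("Total valuation:", uni["Valuation"].sum())  # Q1
print("Companies:", uni["Company"].nunique())      # Q2
print("Industries:", uni["Industry"].nunique())    # Q3
print("Cities:", uni["City"].nunique())            # Q4
print("Countries:", uni["Country"].nunique())      # Q5

# Q6: valuation trend by year (assuming a "Date Joined" column).
year = pd.to_datetime(uni["Date Joined"]).dt.year
print(uni.groupby(year)["Valuation"].sum())

# Q7-Q9: top-10 breakdowns.
print(uni["Country"].value_counts().head(10))                    # most unicorns
print(uni.groupby("Industry")["Valuation"].sum().nlargest(10))   # most valued industries
print(uni.nlargest(10, "Valuation")[["Company", "Valuation"]])   # most valued companies
```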
For the Titanic data, the following questions were answered (see the sketch after this list):
1. What is the total number of passengers on board?
2. What is the total number of male and female passengers?
3. What is the average age of passengers?
4. What is the total number of survivors?
5. What is the total number of deaths?
6. What is the average fare?
7. What is the passenger distribution by port of embarkation?
8. What is the passenger distribution by ticket class?
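Likewise, a pandas sketch of the Titanic questions, again assuming the standard Kaggle column names (Sex, Age, Survived, Fare, Embarked, Pclass) rather than the author's actual Power BI measures.

```python
# Pandas sketch of the Titanic questions (assumed columns/path).
import pandas as pd

df = pd.read_csv("titanic.csv")  # hypothetical path

print("Passengers on board:", len(df))               # Q1
print(df["Sex"].value_counts())                      # Q2: male vs. female
print("Average age:", df["Age"].mean())              # Q3
print("Survivors:", (df["Survived"] == 1).sum())     # Q4
print("Deaths:", (df["Survived"] == 0).sum())        # Q5
print("Average fare:", df["Fare"].mean())            # Q6
print(df["Embarked"].value_counts())                 # Q7: by port of embarkation
print(df["Pclass"].value_counts())                   # Q8: by ticket class
```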
Quantum Analytics NG, Jonathan Osagie, Segun Umoru
Technical Lead @Esri | Geodata Viz Advocate | Fabric Enthusiast
We live in an era where data is readily available at our fingertips. This has led to overwhelming changes, but one thing is sure: location analytics can help us put all this data in context. By integrating external and internal data sources with our ML models, we can better understand how to make critical decisions. I have created a dashboard showcasing how a hospital or business can leverage Esri consumer spending forecasting. #mapping #ML #dataviz #GIS Este Geraghty, MD, MS, MPH, CPH, GISP, Kymberli Fieux, Maureen Graney
Please take a look and let me know your thoughts.
https://lnkd.in/gw4a7Jbh
Linear models play a crucial role in data analysis by determining whether it's feasible to predict the target variable from the available features. This exploration is often referred to as detecting a signal in the data. Comparing the error of a linear model against that of more intricate models can aid the decision of whether the added complexity is worth it. #DataAnalysis #LinearModels #PredictiveAnalytics
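A small, self-contained illustration of that idea: fit a linear baseline and a more complex model on the same data and compare cross-validated error; if the complex model doesn't clearly win, the signal is likely linear and the extra complexity may not be warranted. The data below is synthetic, purely to make the example runnable.

```python
# Compare a linear baseline against a more complex model via CV error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
# Synthetic target with a mostly linear signal plus noise.
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=500)

for name, model in [("linear", LinearRegression()),
                    ("boosted trees", GradientBoostingRegressor(random_state=0))]:
    mse = -cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_squared_error").mean()
    print(f"{name}: CV MSE = {mse:.3f}")
```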
Everything our senses gather is transformed, deep inside our minds, into simple, manageable representations, or symbols. (Terrence W. Deacon) #dataviz #visualization #data
Over recent years I have spent a lot of time visualising data to answer questions, explore possibilities, explain situations in context, and simply communicate answers to questions. I really like the David McCandless diagram on what makes a good data visualisation; it's a good checklist.
My visualisation and analysis tool of choice is SightXR. It puts data into context; relationships, linkages, and areas of interest become clear. It's a good way to make better sense of data and information.
#Informationisbeautiful #datavisualization #data #SightXR
🧐 A great graphical representation of a core principle of statistical analysis.
Causation and correlation are not synonymous interpretations of data. Correlations are a common occurrence in data, and while they are helpful guides for further research and experimentation, they alone cannot provide actionable, evidence-based recommendations. Two events happening in parallel do not explain a cause-effect relationship the way causal (driver) analysis can.
Look for driver analysis, but if statistical analysis can't reliably identify the driver(s), know that correlation cannot be treated as a "runner-up" for defining a reliable action strategy or for diagnosing relationships between variables as a standalone outcome. Correlation indicates that further inquiry is needed before asserting a conclusion.
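A toy numerical illustration of the point: two variables driven by a shared confounder correlate strongly even though neither causes the other. The variable names are hypothetical and the data is synthetic.

```python
# Spurious correlation via a shared confounder (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
confounder = rng.normal(size=1_000)                         # e.g. temperature
ice_cream = confounder + rng.normal(scale=0.3, size=1_000)  # hypothetical series
drownings = confounder + rng.normal(scale=0.3, size=1_000)  # hypothetical series

r = np.corrcoef(ice_cream, drownings)[0, 1]
print(f"correlation = {r:.2f}")  # high, yet neither variable causes the other
```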
#researchimpact #researchmethods #statisticalanalysis
Hi everyone,
Overview
Here is my 12th project, which I recently designed with Quantum Analytics NG, showing an audit of airplane crashes and fatalities in the United States from 1908 to 2009.
Visualization tool: Power BI
Here's a summary based on the key data points (a pandas sketch of these aggregates follows the lists below):
Key KPIs:
- Total Number of Flights: 714
- Total Fatalities: 105,000
- Grounded Flights: 8,440
- Number of Operators: 2,470
- Number of Routes Covered: 3,244
Top 7 Fatalities by Airplane:
These mostly involve Zeppelin airships:
1. Zeppelin LZ-129: 35 fatalities
2. Zeppelin L-43: 24 fatalities
3. Zeppelin L-59: 23 fatalities
4. Zeppelin L-70: 22 fatalities
5. Zeppelin L-8: 21 fatalities
6. Zeppelin L-53: 19 fatalities
7. Zeppelin L-44: 18 fatalities
Top Fatalities by Location:
- Zurich, Switzerland: 105 fatalities
- Zawoja, Poland: 53 fatalities
- Zavn, Mongolia: 40 fatalities
- Zelez, Russia: 33 fatalities
- Zarate, Argentina: 28 fatalities
Fatalities by Month:
- November: 10.8K fatalities
- March: 10.4K fatalities
- August: 10.3K fatalities
- December: 10.3K fatalities
- January: 9.4K fatalities
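The dashboard was built in Power BI, but here is a hedged pandas sketch of how aggregates like these could be reproduced in code. The file name and columns (Date, Type, Operator, Route, Location, Fatalities) are assumptions based on the common "Airplane Crashes and Fatalities" dataset, not the author's actual data model; the "Grounded Flights" KPI has no obvious column in that dataset, so it is omitted here.

```python
# Pandas sketch of the crash-dashboard aggregates (assumed columns/path).
import pandas as pd

crashes = pd.read_csv("airplane_crashes.csv", parse_dates=["Date"])  # hypothetical

print("Total flights (records):", len(crashes))
print("Total fatalities:", crashes["Fatalities"].sum())
print("Operators:", crashes["Operator"].nunique())
print("Routes covered:", crashes["Route"].nunique())

# Top aircraft and locations by fatalities.
print(crashes.groupby("Type")["Fatalities"].sum().nlargest(7))
print(crashes.groupby("Location")["Fatalities"].sum().nlargest(5))

# Fatalities by month.
print(crashes.groupby(crashes["Date"].dt.month_name())["Fatalities"].sum())
```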
My Recommendations:
To reduce fatalities, I would recommend focusing on:
- Safer aircraft designs: The transition from airships to airplanes significantly improved safety. Continuing to invest in the latest technology and aircraft designs remains critical.
- Improving route safety: Given that certain routes and regions had higher fatality rates, investing in better navigation systems and weather tracking, particularly in areas prone to accidents, could lower risks.
Thank you.
I'm grateful to Jonathan Osagie, Godswill Ojumeaka, and Quantum Analytics NG for their help.