SQL stands for Structured Query Language. It is the language used to interact with any Relational Database Management System (RDBMS), such as Oracle, SQL Server, MySQL, or PostgreSQL.
SQL training teaches you how to query any database. The most common queries pick out a subset of rows or records from a table.
A single table SQL query is pretty easy to write. You specify the filter criteria, execute the query, and the matching rows are retrieved from the database and displayed almost instantly. SQL is usually extremely fast.
A multiple table SQL query requires you to understand the relationships between the different tables and their primary and foreign keys, and then carefully join the common fields between the tables.
Once the common fields are linked, the joined tables essentially form one long, flat row from which you can pick any field from any table in the combination. This is the power of SQL, and the reason almost everyone wants to learn it.
Multiple table queries are the more useful kind; they are akin to writing multiple VLOOKUPs in Excel to reach the master data. They are also where most people get stuck, because of the various ways to join: inner join, left join, right join, full outer join, etc.
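To make this concrete, here is a minimal sketch using Python's built-in sqlite3 module. The customers and orders tables, and all their columns, are hypothetical, invented purely for illustration.

```python
import sqlite3

# In-memory database with two hypothetical tables for illustration
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         amount REAL,
                         FOREIGN KEY (customer_id) REFERENCES customers(id));
    INSERT INTO customers VALUES (1, 'Alice', 'Singapore'), (2, 'Bob', 'Penang');
    INSERT INTO orders VALUES (101, 1, 250.0), (102, 1, 90.5), (103, 2, 42.0);
""")

# Single-table query: filter rows with a WHERE clause
for row in conn.execute("SELECT name FROM customers WHERE city = 'Singapore'"):
    print(row)

# Multi-table query: INNER JOIN on the primary/foreign key pair
for row in conn.execute("""
    SELECT c.name, o.amount
    FROM customers c
    INNER JOIN orders o ON o.customer_id = c.id
"""):
    print(row)
```

The WHERE clause handles the single-table filter, while the JOIN links the foreign key in orders to the primary key in customers, producing the long flat row described above.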
This is where SQL training comes in handy. A short 1–2 day workshop can give you the fundamentals of SQL: how to insert rows into a table, delete records, modify entries, and retrieve data based on any selection criteria.
If you can attend classroom training, I would recommend it, because you can ask questions on the spot and a trainer can assist you. But if that is not feasible, there are ample online SQL courses too.
SQL is a simple, business-like language. It is like talking to someone in English: straightforward and intuitive, so it does not feel like learning a new language at all. A two-day SQL workshop is more than sufficient to get you going on your journey to pull any data from your corporate databases.
Microsoft Access is a Relational Database Management System (RDBMS) from Microsoft that combines the relational Microsoft Jet Database Engine with a graphical user interface and software-development tools to build forms, queries, and reports.
It is ideal for getting started: you can easily build a customer database, accounting system, warranty tracking system, or order and inventory tracking system.
Microsoft Access stores all database tables, queries, forms, reports, macros, and modules in the Access Jet database as a single file.
Developed by: Microsoft Corporation
Latest version: 16.0
Advantages of Microsoft Access
Only one installation needed (the RDBMS and design tools come in one simple package)
Easy to install
Easy to integrate
Cost-wise, it is much lower, and it is possibly already included in your Microsoft license.
Now, coming back to the question: Access applications built more than 20 years ago are still in use, and people are building new, mission-critical applications with Microsoft Access today.
Microsoft Access is still a viable tool for personal or small workgroup applications. A small business is not going to spend thousands on a software project that takes a year or more to develop.
Access has been denigrated by mainline IT people almost from the beginning, because it gives end users control, and that takes control away from the IT department.
It has multiple uses and is used worldwide. The Danish government uses an Access app for contact tracing of COVID cases.
And in Singapore, Access is truly relevant and used every day by thousands of companies and individuals.
So yes, if you need to develop a database for personal use or within a workgroup, it is still highly relevant.
Do you analyze data? Whether you work in Sales, Finance, Manufacturing, Logistics, Engineering or Customer Service, there is a constant need to analyze data.
With the advent of the industrial age and the conveyor belt, gaining efficiency and reducing errors has been a constant battle. We need more and better ways to analyze data. Today, billions of new data points are gathered every minute. There is more storage available in the cloud, and there is no dearth of computers big and strong enough to process it quickly.
Yet there is a lack of know-how, a lack of understanding of how to analyze data. As a result, many companies are still not able to tap into the promise of Big Data, Artificial Intelligence, the Internet of Things, or Machine Learning.
To understand the state of data analysis, and to look at solutions, we need to understand a few fundamentals about data and data analysis.
Why Do We Want To Analyze Data?
Of course there are a gazillion reasons.
No… not really. The need to analyze data is rooted in only a few basic reasons.
To Gain Insights about our data. Why?
So that we can find out what works and what does not, and so we can compare past results with another company, another product, or another period, and see if there is some insight to glean.
When we find out what works, and where the problem lies, we can then perform root cause analysis, which tells us the real problem to fix.
When we have a fever, we can take a tablet of Paracetamol (Tylenol/Panadol). However, the fever is not the disease. It is a symptom that develops when something is not right with our body and the body is fighting the bad bacteria or virus; during this fight, the temperature rises. The Paracetamol can reduce the fever, but it does not end the fight. To get to the root cause of the problem, a visit to the doctor is required. The doctor may analyze the problem by looking at your blood pressure, heart rate, x-rays, probes, etc. and figure out the issue. If it is indeed bad bacteria causing some area to swell, the doctor can prescribe a course of antibiotics, which fight the bad guys and treat the real problem. Once the bad guys are gone, the fever comes down to its normal level on its own.
Make Better & Faster Decisions
Based on the insights, we are able to make better decisions, and faster too. Backed by data, we can have more confidence in our decision-making process, rather than relying on gut feeling alone.
With statistical methods for analyzing data and forecasting trends, the accuracy of the analysis improves and the insights get deeper as we generate better analytics. We can then make more informed decisions, faster.
How Do We Analyze Data: The Traditional Data Analysis Process
The data analysis process is simple, and we have been following it for so long that it has almost become a habit… albeit not a good one. What we usually do is outlined below:
Get the Raw Data File From The Source System
This could be a text file, a Comma-Separated Values (CSV) file, a PDF file, an SQL data dump, or an Access or RDBMS source (think Oracle, Microsoft SQL Server, or MySQL).
Load into Excel
Load this data into Excel. This is easy to do, but it is a manual process that has to be repeated every day, week, or month, whenever the data changes or new data arrives.
Clean Up the Data
Incoming data is seldom clean. There will be missing values, duplicates, wrong headers, dates mis-aligned in YYYYMMDD or some other odd text or numeric layout, and other data issues to fix.
To do this, you’ll have to write some simple formulas and functions, run data cleanup steps, fix dates, and fill blanks or nulls with appropriate values. This is all a manual process, unless automated with the help of macros.
Lookup Master Data
Once the data is cleaned up, we need to replace codes with values, and pull the correct product price, employee salary, or date of manufacture from master data tables. This often requires expertise with Excel formulas like VLOOKUP and HLOOKUP. You may have to do some interim calculations too: create calculated columns, and build one wide table with all the fields/columns required for the analysis in the next step.
Create Pivot Tables
To analyze the data, one of the quickest ways is to create a Pivot Table. Pivot Tables help you summarize data – Sums, Counts, Differences, Cumulative Totals, etc. – split by rows and columns. Further, you can create multiple filters or slices using the Filters or Slicers options.
By default, a Pivot Table does not refresh automatically; you have to either set up this option or refresh manually. And even when you refresh, it may not pick up newly added data, because the source may be set explicitly to an absolute range. Extending that absolute range is an extra step you have to remember every time.
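By contrast, scripted tools avoid that problem altogether. Here is a minimal pandas sketch of a pivot-table-style summary built in code; the region, month, and sales columns are made up for the example.

```python
import pandas as pd

# Hypothetical sales data standing in for the Excel source range
df = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "month":  ["Jan", "Feb", "Jan", "Feb"],
    "sales":  [100, 120, 80, 95],
})

# Pivot-table-style summary: regions as rows, months as columns, summed sales.
# Re-running the script always picks up whatever rows df currently holds,
# so there is no absolute range to remember to update.
summary = df.pivot_table(index="region", columns="month",
                         values="sales", aggfunc="sum")
print(summary)
```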
Create Charts
To present the data, we mostly rely on the popular bar charts, column charts, pie charts, and the newly introduced map, funnel, or treemap charts. However, charts often get too busy, and there are not many options to customize them. Once the charts are ready, you don’t want to send out the entire Excel file, yet sending the charts alone, without the underlying data, is not an option either. This brings us to the next point…
Paste Charts Into PowerPoint
PowerPoint is the darling of the corporate boardroom. No presentation is ever complete without a mega slideshow. So people use Excel to analyze the data, and then paste the charts into PowerPoint. They create hundreds of slides, marvel at their slide deck, and finally they are ready for the big presentation to the directors.
Present to Management using PPT Slides
The PowerPoint show begins, and your audience is thrilled by your analysis, presentation, and charts. You are secretly gloating over the success. Suddenly a member of the audience has a question: can you compare this quarter with the same quarter 5 years ago?
Well… yes, you can. But you don’t have the data right now. You’ll have to get back to your desk, redo the analysis, and probably show them the chart at your next quarterly meetup.
The opportunity window is so small that by the time you get back to them, they probably no longer need the information. You have to cut a sorry figure, and all your great work comes to nothing…
What’s wrong with this whole scenario? Let’s find out.
Problems With Traditional Approach To Data Analysis
There are a number of major problems with the traditional approach to data analysis, which has been around for the past 20–30 years… pretty much the entire era of the spreadsheet. To name a few:
Excel Charts and Reports Are Not Interactive
Traditional Excel or PowerPoint reports and charts are not interactive; they are effectively screenshots, so they do not change dynamically. Good, because the output is always the same and renders fast. Bad, because it cannot be reused and has to be manually refreshed each time.
You Can’t Share Excel Files Easily
People worry when it comes to sharing their Excel files. They have security and privacy concerns. And an Excel file is quite fragile: a change to any path, sheet, formula, or cell can break the workings, rendering it completely useless.
Excel VLookups are Slow & Cumbersome To Use
Excel files rely on VLOOKUPs to pick the correct employee salary, product description attributes, or prices. It works, and it has been a mainstay of Excel’s popularity. However, VLOOKUPs recalculate extremely slowly on a linked Excel file with anything in excess of 200,000 records. At half a million, things come to a standstill, and many times Excel will simply freeze and crash.
Excel Takes Time to Clean & Refresh every time
Newly added data has to be cleaned up, which takes time and is a slow, painstaking process. If you forget to update or refresh on time, the report shows old or wrong data and the effort is wasted. New data has to be added and refreshed every single time.
Security of Excel Data is Quite Fragile
Sending Excel reports without the data is simply not possible, and sending them with the data raises security concerns. Any leak can expose your pricing and cost you your competitive advantage. And any broken links will render the data useless too.
A Very Small, Finite Limit of Excel Rows
There is only a finite number of rows that can be loaded into Excel; the current limit is about a million rows (1,048,576 to be exact). But in today’s world, a million rows is nothing. We need ways to go well beyond that limit, to many millions of rows.
Pivot Table Limitations
Pivot Tables are quite slow to refresh, and they do not refresh automatically. There are also very few ways to summarize the data: only Sum, Count, Average, and the like. They lack rich transformations like Year-on-Year or Quarter-on-Quarter analysis, and they are not very good at showing overall percentages and their breakdowns.
Wasted Time in Cleanup
80% of the time is spent cleaning the data, and only 20% is left for analyzing it. This does not accord data analysis the time and importance it requires; the equation is completely skewed. It should be the other way round: 20% to clean, and 80% to analyze.
Is There A Solution To All These Data Analysis Woes?
With the current state of Business Intelligence tools, there is more than a glimmer of hope. The industry has finally arrived at a point where Big Data processing is becoming commonplace, no longer the realm of the wealthy or the academics. Today, the common man’s BI tools are already working wonders.
Need for Interactive System
Today, best-of-breed BI tools like QlikView, Tableau, and Microsoft Power BI all offer dynamic ways to present data and interactive ways to visualize it.
Write Once. Use Again.
Now the steps for cleaning the data can be recorded and used again and again, each month, with hardly any tweaks, as the sketch below illustrates. This reduces the need for manual cleanup and improves accuracy and reliability. It offers a way to scale up the data analysis process and work on more value-added services.
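A minimal Python/pandas sketch of such a recorded, reusable cleanup step; the order_date and amount columns are assumptions for the example.

```python
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Recorded cleanup steps, reusable on every new month's file."""
    df = df.drop_duplicates()                             # remove duplicate rows
    df["order_date"] = pd.to_datetime(df["order_date"],   # fix YYYYMMDD text dates
                                      format="%Y%m%d")
    df["amount"] = df["amount"].fillna(0)                 # fill blanks/nulls
    return df

# Hypothetical raw extract; in practice this would be pd.read_csv("latest.csv")
raw = pd.DataFrame({
    "order_date": ["20240601", "20240601", "20240602"],
    "amount": [100.0, 100.0, None],
})
print(clean(raw))
```

Each month, only the input file changes; the cleanup logic runs as-is, with no manual rework.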
Shift Cleanup vs Analyze Ratio to 20-80.
With the added interactivity and automation, more time can finally be spent analyzing the data. Cleanup jobs are set up once and continue to run in the background on auto-pilot. Plus, we can now process data in real time, which drastically improves the prospects of having usable, actionable data and insights.
Load Data From Multiple Sources
The ability to load and merge data from multiple sources is becoming much easier, and automation is turning this chore into a breeze.
Loading data from text files, CSV, Excel, databases, XML, websites, live tickers, and on-site and off-site corporate data warehouses is becoming a reality. This opens up new possibilities.
Remove the Need for Complex, Slow VLookups
Forget slow VLOOKUPs: you can look up any row, any column, irrespective of the order or distribution of the data rows and columns. This opens up new options and makes merging data a simple matter. In fact, we no longer need very wide tables; we can have narrower tables (fewer columns) and taller tables (more data rows).
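As an illustration of lookups without VLOOKUP, here is a short pandas sketch; the orders and products tables and the product_id key are assumptions for the example.

```python
import pandas as pd

# Hypothetical transaction table (tall) and master/lookup table (narrow)
orders = pd.DataFrame({"product_id": [1, 2, 1], "qty": [5, 3, 2]})
products = pd.DataFrame({"product_id": [1, 2],
                         "name": ["Widget", "Gadget"],
                         "price": [9.99, 24.50]})

# One merge replaces a column of VLOOKUPs: every order row gains the
# matching name and price, regardless of row order or table size.
merged = orders.merge(products, on="product_id", how="left")
merged["total"] = merged["qty"] * merged["price"]
print(merged)
```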
Load Huge Data Volume, beyond 1 million rows of Excel
Options like Microsoft Power Query and Microsoft Power Pivot make loading a few million rows a simple matter. There is no need to worry about running out of space in Excel, or about its bandwidth to process a few million rows.
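Power Query and Power Pivot are point-and-click tools, but the underlying idea can be sketched in Python: stream a file far larger than Excel’s row limit in chunks, aggregating as you go. The file name and column names below are hypothetical.

```python
import pandas as pd

# Stream a CSV that may hold tens of millions of rows -- far beyond
# Excel's 1,048,576-row ceiling -- without loading it all into memory.
totals = {}
for chunk in pd.read_csv("big_sales.csv", chunksize=1_000_000):  # hypothetical file
    for region, amount in chunk.groupby("region")["amount"].sum().items():
        totals[region] = totals.get(region, 0) + amount

print(totals)  # aggregated result over the full dataset
```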
Microsoft Power BI: The Magic Wand To Vanquish All Your Data Analysis Problems
Handle Big Data With Ease
Hundreds of Data Sources
Build Relationships. End of VLookups
Multiple Ways to Refresh
Share With Ease
New Ways to Visualize – Maps, Tree Maps, Funnels, KPIs, Speedometers
Security of Sharing
Multi Device Support – Web, Desktop, Mobile, Tablet, Without installing any additional software
The workplace has already changed. As technology continues to rapidly transform industries & jobs, staying relevant & competitive requires continuously updating, diversifying, and building completely new skill sets.
Today, data analysis is no longer the job of IT folks alone. Everyone is required to analyze the past, adapt to change, and forecast the future. The ability to do so well is increasingly what will keep your job.
Learn Python & Stay Relevant
Python is a great programming language for data science and general data analysis. It is open-source & free to download for anyone, unlike commercial tools like SAS or SPSS. Find out why you must learn Python for your current & future jobs.
Purpose of Learning Python
It is suitable for almost any data science task, from data manipulation and automation to ad-hoc analysis and exploring datasets.
Python is easy to learn, even for complete beginners. You don’t need a background in IT or computer science.
Python is used by people who want to go deeper into data analysis or apply statistical techniques, and by most people who turn to data science.
Python is a production-ready language, meaning it has the capacity to be a single tool that integrates with every part of your workflow!
Why Learn Python: Its Usability is Great
Whether you work in Manufacturing, Services, Banking, Finance, Logistics, Telecom, Marine, Oil & Gas, Shipping, IT or any other industry, you’ll be able to apply Python to everyday work. And people with a software or engineering background may find Python comes more naturally to them.
Coding and debugging are way easier than in other programming languages, because of Python’s simple syntax.
Python has a robust ecosystem and is commonly considered one of the easier programming languages to read and learn. Its programming syntax is simple and its commands mimic the English language.
Python code is syntactically clear and elegant, easily interpretable, and easy to type.
It’s great for building data science pipelines and machine learning products integrated with web frameworks at scale.
Why Learn Python: It’s a Flexible Language
Python is flexible enough to create things that have never been done before.
You can also use it to script websites, clean or scrape web data (see the sketch below), merge data from multiple sources, and create games and other applications easily.
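As a small example of scraping web data, assuming the third-party requests and BeautifulSoup (bs4) packages are installed:

```python
import requests
from bs4 import BeautifulSoup

# Fetch a page and pull out its title and all link targets.
# example.com is a placeholder; swap in the page you actually need.
resp = requests.get("https://example.com", timeout=10)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
print(soup.title.string)                       # the page <title>
for a in soup.find_all("a", href=True):        # every hyperlink on the page
    print(a["href"])
```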
Why Learn Python: It is Extremely Easy To Learn
Python’s focus on readability and simplicity means its learning curve is gentle and smooth, and you will notice the difference as soon as you begin. Once you know the basics of Python, you should go for Python For Data Analysis Training in Singapore.
Python is considered a good language for beginners. No wonder it is the Number 1 Programming language in the world.
Advantages of Python Over Other Languages
General-purpose programming languages are useful beyond just data analysis.
Python has gained popularity for its code readability, speed, and many other functionalities too.
It is great for mathematical computation & learning how algorithms work.
Python has high ease of deployment and reproducibility.
Popular Libraries and Packages
pandas to easily manipulate data
SciPy and NumPy for scientific computing
Scikit-learn for machine learning
Matplotlib and seaborn to make graphics & charts
statsmodels to explore data, estimate statistical models, and perform statistical tests (a short sketch using these libraries follows below)
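Here is a minimal sketch combining two of these libraries, pandas and Matplotlib; the revenue figures are invented for the example.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Tiny illustration: manipulate data with pandas, chart it with Matplotlib.
df = pd.DataFrame({
    "month":   ["Jan", "Feb", "Mar", "Apr"],
    "revenue": [120, 135, 128, 150],
})

df["change_pct"] = df["revenue"].pct_change() * 100   # month-on-month change
print(df)

df.plot(x="month", y="revenue", kind="bar", legend=False, title="Revenue")
plt.tight_layout()
plt.show()
```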
Getting Started in Python
There are many Python IDEs to choose from, which drastically reduce the overhead of organizing code, output, and notes files.
Jupyter Notebook and Spyder are two such popular IDEs that we use to teach Python at Intellisoft.
Learn Python Step-By-Step: Instructor Led Training
Intellisoft’s instructor-led Python classes in Singapore walk you through these tools step by step, hands-on.
Learn C Programming Step-By-Step: Instructor Led Training
This training is divided into 2 classes: a beginners’ 2-Day C Programming Course, where you will learn the fundamentals of C programming – data types in C, operators, arrays, conditional statements, etc. This will help you build a solid foundation in the C programming language.
Once you have mastered these foundational concepts of C programming, you will be ready to move on to the more advanced concepts in the C language – pointers, memory, string manipulation, arrays, files, and inputs and outputs…
Some of the advanced C language concepts we cover in our 2-day public classes in Singapore:
* Working with bits: number systems, OR, XOR, NOT, setting bit flags, bitwise shift operators, extracting bit values
* Pointing to data: getting values via pointers, passing pointers to functions, arrays of pointers, pointing to functions
* Manipulating strings: string functions, working with strings
* Building structures: nested structures, arrays & pointers in structures, arrays of structures
* Unions: arrays of unions
* Standard input & output file streams: working with files
* Header files and linking: functions in custom headers, linking multiple source files, static functions.
It is best to take the Basic and Advanced courses together, back to back. You can finish both in the same week and be ready to start coding in the C language.
Why not take this opportunity to check out the course outlines and enroll in the C Language Training Programs? Check out our Training Calendar and register online for the C Programming Courses. You can also call us at +65 6296-2995, or email us.