Business Analyst Interview Preparation
Welcome to the Business Analyst interview preparation guide. Today we will cover some basic topics on SQL, Python, Machine Learning, and case studies. Let's get started.
Resources to cover for Business Analyst interview preparation:
We are well known for our interview books and have 70+ e-books across Amazon and The Data Monk e-shop page. The following are the best-seller combo packs and services that we are providing as of now:
YouTube channel – Covers all the interview-related important topics in SQL, Python, MS Excel, Machine Learning algorithms, Statistics, and direct interview questions. Link – The Data Monk Youtube Channel
Website – ~2000 completely solved interview questions in SQL, Python, ML, and Case Study. Link – The Data Monk website
E-book shop – We have 70+ e-books available on our website and 3 bundles covering 2000+ solved interview questions. Do check it out. Link – The Data E-shop Page
Instagram Page – It covers only the most asked questions and concepts (100+ posts), explained in simple terms. Link – The Data Monk Instagram page
This article is all about the Amazon Leadership Principles for interviews. But why do we need to understand and prepare these principles? First and foremost, when you interview at Amazon, in all or some of the rounds there will be a lot of case studies or questions which need to be answered using the Amazon Leadership Principles. Secondly, once you understand these principles, you will actually feel that you understand how good or bad you were at solving business problems, what mistakes you made, and how you figured things out, knowingly or unknowingly. The Amazon Leadership Principles help you understand your own work in a much better way.
Always be prepared with these principles. I will put out questions on these principles and you need to think through and answer them. These might seem very natural when you are practicing at home, but when they are asked in an interview you might end up blabbering 😛
By no means am I saying that these are the only definitions and answers to the questions. I sincerely urge you to write your own answers in the STAR format, give it a shot, and we will evaluate the answers 🙂
Amazon Leadership Principle Questions for interview
Amazon's 14 Leadership Principles:
Customer Obsession – This world is customer-centric: you make an app for customers, you provide services to customers, you do everything for your customer. If you are working in a client-facing company, then your client is your customer.
My example – I once created an ML model, but since it was a high-priority task with a time crunch, I missed an important variable and thus went with a half-cooked model. How I mended it – I looked for more variables, did rigorous EDA, and came up with far better results.
This is the STAR methodology:
S – What was the Situation? An ML model with a time crunch.
T – What was the Task at hand? Had to work with limited resources to predict XYZ with ABC accuracy.
A – What Action did you take? After I missed out on this opportunity, we did rigorous EDA and identified many good variables.
R – What was the Result? A very stable and robust model with better accuracy.
Ownership – How good are you at taking ownership? Are you a 'minimum guy' like Srikant 😛
I once identified a problem the client was facing in detecting fraudulent behavior and then approached it with a scalable solution.
Invent and Simplify – How innovative are you? Have you ever taken up a complex problem and solved it with a fresh and simple approach? Well, there was a situation when we were trying to build an application to check whether two migrated reports were identical or not. We tried to build a pipeline, but it was taking a lot of time. There was a freely available piece of software which could break a PDF into screenshots, and another application which could match two pictures. We built a simple program to automate these two processes. Simple solution.
Are Right, A Lot – Leaders have good judgment and they are mostly right. In your previous role, did you ever face an issue which you solved using your judgment or past experience? There was a time when my colleagues were trying to build a time series model but were unable to figure out the reason for the low accuracy. I asked them to do EDA and figure out some variables which might be the reason for the dips and hiccups. Then I suggested using a model like ARIMAX, which can handle both time series and regression !!
Hire and Develop the Best – Do you hire the best, or do you hire and develop the best? Tell me one instance when you realized that a person in your team was not the best fit for the role. What did you do? In my last organization, I was once giving KT to a person who was not able to grasp some concepts. What did I do? I made extensive documentation and videos of the complete KT process and advised him to take a couple of courses on the internet. The output – he was able to pick up things very quickly and deliver everything at a much better speed.
Amazon Senior Business Analyst interview questions
Name – Can't disclose as the person is joining Amazon
Previous Company – Mu Sigma Inc.
Previous Company Designation – Decision Scientist
Work Experience – 3 years 6 months
Tools and Technologies used in Previous Company – SQL, Python, Tableau, AWS
Company Interviewing for – Amazon
Industry – E-commerce
Designation – Senior Business Analyst
Role/Level – Can't disclose
Location – Bangalore
Interview Year – 2021
Any tips for the aspirants appearing for the company in the near future – Work on your SQL skills and go through your resume again and again. Everything will be asked from that.
Compensation – 5/5
In total there were 4 rounds:
– Written SQL
– Technical (SQL and Project)
– Technical and Case Study (Python and Case Study)
– Hiring Manager
P.S. – We have started mock interviews with candidates who have cleared these interviews. If you want to take a mock interview with an Amazon candidate, then fill this form – Mock Interview with Amazon
Round 1 – Written
Topics covered – SQL
Mode of interview – There were 6 people on a video call taking the round simultaneously
Duration – 45 mins
Level of Questions – 7/10
There were 4 table schemas and 5 questions on SQL. They were mostly around joins, conditions on dates, and self joins. There was no IDE to test whether your answer was correct or not.
The questions were not very complex, but you had to understand which tables to join.
One sample question – Out of all the products, print the product names for which inventory is 2 times the orders for March'21. Inventory, product, order, and sales tables were given.
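The actual Amazon schema was not disclosed, so the sketch below uses made-up table and column names (product, inventory, orders) purely to illustrate the join-plus-aggregate shape such a question expects. It runs the SQL through Python's built-in sqlite3 module so you can check your query without an IDE:

```python
import sqlite3

# Hypothetical schema and data -- the real tables were not shared in the
# interview write-up, so these names and rows are assumptions for practice.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE product   (product_id INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE inventory (product_id INTEGER, qty INTEGER);
CREATE TABLE orders    (product_id INTEGER, order_qty INTEGER, order_date TEXT);

INSERT INTO product   VALUES (1, 'Echo Dot'), (2, 'Kindle'), (3, 'Fire TV');
INSERT INTO inventory VALUES (1, 100), (2, 10), (3, 50);
INSERT INTO orders    VALUES (1, 40, '2021-03-05'), (2, 9, '2021-03-11'),
                             (3, 30, '2021-03-20');
""")

# Product names whose inventory is at least 2x their total March'21 orders.
rows = con.execute("""
SELECT p.product_name
FROM product p
JOIN inventory i ON i.product_id = p.product_id
JOIN (
    SELECT product_id, SUM(order_qty) AS total_orders
    FROM orders
    WHERE order_date BETWEEN '2021-03-01' AND '2021-03-31'
    GROUP BY product_id
) o ON o.product_id = p.product_id
WHERE i.qty >= 2 * o.total_orders
""").fetchall()

print([r[0] for r in rows])
```

The key idea the interviewers were testing: aggregate the orders per product for the month first, then join that derived table back to inventory for the comparison.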
Round 2 – Technical
Topics covered – SQL, Python and Predictive modeling
Mode of interview – Video Call
Duration – 1 hour
Level of Questions – Difficult (9/10)
In any recruitment process, the first technical round is the most challenging one, as it sets your bar for the next round. I started with my introduction. I had a couple of projects on predictive modeling using Regression, ARIMA, LSTM, and ARIMAX.
I did well on these questions, so the next part was on my modeling experience.
I explained the project; the questions were not on the technical part of the model but more around what variables were used in the regression and the meaning of these variables.
Be very, very clear about your projects; they will occupy 60% of your time.
Round 3 – Case Study and Personality Test
Topics covered – Business Case Study and Problem Solving
Mode of interview – Video Call
Duration – 1 hour
Level of Questions – Intermediate (8/10)

Round 4 – Hiring Manager
Topics covered – About the role in the previous organization, a brief about the current role
Mode of interview – Video Call
Duration – 30 mins
Level of Questions – NA
It was more about culture fit. We discussed one of the projects on my resume. Then there were questions on why I want to join Amazon and what a normal day in my life looks like. It was a healthy discussion.
The whole process took 1 week, and I was offered a position at par with my expected salary.
P.S. – As pointed out by a couple of users, the leadership-principles interview is not actually the only way to get into Amazon. There are different levels, teams, and positions for which recruitment is done. Leadership-principle questions are always a part of each round, but we can neither disclose the principles nor those questions. We reconfirmed that Chime was being used for the video call and the SQL test, but the application used for the written test should not be a concern 😛 We will try to put generic names for the mode of interview.
Today is Day 18 and we will look into Flipkart Business Analyst interview questions. Flipkart mostly hires for the Business Analyst and Senior Business Analyst positions; you can try to move into the Data Science domain after entering the organization. Kindly let us know if someone is working as a Data Scientist with <3 years of experience and a Bachelor's degree.
Today we will cover the complete Flipkart Business Analyst Interview Questions and recruitment process
Position – Senior Business Analyst
Location – Bangalore
Offered Salary – ~12 LPA
Experience – ~3 years
Number of Rounds – 5
Round 1 – Aptitude and Logical Reasoning
Round 2 – Case Study (Non-Elimination Round)
Round 3 – Technical Interview (SQL and Python)
Round 4 – Project discussion
Round 1 – Aptitude and Logical Reasoning
There were 20 MCQs to be solved in 20 minutes, mostly from Probability, Time and Distance, Percentages, Permutations and Combinations, and Logical Reasoning.
My project was on sentiment analysis, i.e. the NLP side, so I was asked questions on bag of words, tf-idf, word correlation, tokenization, and a bit of statistics. We will share these questions in the NLP division of the website.
Q1. What are the parts of the Microsoft self-service business intelligence solution?
Microsoft has two parts for Self-Service BI:
Excel BI Toolkit – It allows users to create an interactive report by importing data from different sources and modeling the data according to the report requirements.
Power BI – It is the online solution that enables you to share the interactive reports and queries that you have created using the Excel BI Toolkit.
Q2. What is self-service business intelligence?
Self-Service Business Intelligence (SSBI) is an approach to data analytics that enables business users to filter, segment, and analyze their data without in-depth technical knowledge of statistical analysis or business intelligence (BI). SSBI has made it easier for end users to access their data and create various visuals to get better business insights. Anybody who has a basic understanding of the data can create reports and build intuitive and shareable dashboards.
Q3. What is Power BI?
Power BI is a cloud-based data sharing environment. Once you have developed reports using Power Query, Power Pivot, and Power View, you can share your insights with your colleagues. This is where Power BI enters the equation. Power BI, which technically is an aspect of SharePoint Online, lets you load Excel workbooks into the cloud and share them with a chosen group of co-workers. Not only that, but your colleagues can interact with your reports to apply filters and slicers to highlight data. All of this is made possible by Power BI, a simple way of sharing your analysis and insights from the Microsoft cloud.
Power BI features allow you to:
Share presentations and queries with your colleagues.
Update your Excel file from data sources that can be on-site or in the cloud.
Display the output on multiple devices. This includes PCs, tablets, and HTML 5-enabled mobile devices that use the Power BI app.
Query your data using natural language processing (or Q&A, as it is known).
Q4. What is Power BI Desktop?
Power BI Desktop is a free desktop application that can be installed right on your own computer. Power BI Desktop works cohesively with the Power BI service by providing advanced data exploration, shaping, modeling, and report creation with highly interactive visualizations. You can save your work to a file or publish your data and reports right to your Power BI site to share with others.
Q5. What data sources can Power BI connect to?
The list of data sources for Power BI is extensive, but it can be grouped into the following:
Files – Data can be imported from Excel (.xlsx, .xlsm), Power BI Desktop files (.pbix), and Comma Separated Values (.csv).
Content Packs – A content pack is a collection of related documents or files that are stored as a group. In Power BI, there are two types of content packs: firstly, those from service providers like Google Analytics, Marketo, or Salesforce, and secondly, those created and shared by other users in your organization.
Connectors – Connectors to databases and other datasets, such as Azure SQL Database, SQL Server Analysis Services tabular data, etc.
Q6. What are the Building Blocks in Power BI?
The following are the Building Blocks (or key components) of Power BI:
Visualizations – A visualization is a visual representation of data. Example: pie chart, line graph, side-by-side bar charts, graphical presentation of the source data on top of a geographical map, tree map, etc.
Datasets – A dataset is a collection of data that Power BI uses to create its visualizations. Example: Excel sheets, Oracle or SQL Server tables.
Reports – A report is a collection of visualizations that appear together on one or more pages. Example: Sales by Country/State/City report, Logistics Performance report, Profit by Products report, etc.
Dashboards – A dashboard is a single-layer presentation of multiple visualizations, i.e. we can integrate one or more visualizations into one page layer. Example: a sales dashboard can have pie charts, geographical maps, and bar charts.
Tiles – A tile is a single visualization in a report or on a dashboard. Example: a pie chart in a dashboard or report.
Q7. What are the different types of filters in Power BI?
Power BI provides a variety of options to filter reports, data, and visualizations. The following are the filter types:
Visual-level Filters – These filters work on only an individual visualization, reducing the amount of data that the visualization can see. Moreover, visual-level filters can filter both data and calculations.
Page-level Filters – These filters work at the report-page level. Different pages in the same report can have different page-level filters.
Report-level Filters – These filters work on the entire report, filtering all pages and visualizations included in the report.
We know that Power BI visuals have an interactions feature, which makes filtering a report a breeze. Visual interactions are useful, but they come with some limitations:
The filter is not saved as part of the report. Whenever you open a report, you can begin to play with visual filters, but there is no way to store the filter in the saved report.
The filter is always visible. Sometimes you want a filter for the entire report, but you do not want any visual indication of the filter being applied.
Q8. What are content packs in Power BI?
Content packs for services are pre-built solutions for popular services as part of the Power BI experience. A subscriber to a supported service can quickly connect to their account from Power BI to see their data through live dashboards and interactive reports that have been pre-built for them. Microsoft has released content packs for popular services such as Salesforce.com, Marketo, Adobe Analytics, Azure Mobile Engagement, CircuitID, comScore Digital Analytix, Quickbooks Online, SQL Sentry, and tyGraph. Organizational content packs provide users, BI professionals, and system integrators the tools to build their own content packs to share purpose-built dashboards, reports, and datasets within their organization.
Q9. What is DAX?
To do basic calculations and data analysis on data in Power Pivot, we use Data Analysis Expressions (DAX). It is a formula language used to compute calculated columns and calculated fields.
DAX works on column values.
DAX cannot modify or insert data.
We can create calculated columns and measures with DAX, but we cannot calculate rows using DAX.
Sample DAX formula syntax:
Total Sales = SUM(Sales[SalesAmount])
For the measure named Total Sales, calculate (=) the SUM of values in the [SalesAmount] column in the Sales table.
A – Measure name (Total Sales)
B – = indicates the beginning of the formula
C – DAX function (SUM)
D – Parentheses for the SUM function
E – Referenced table (Sales)
F – Referenced column name ([SalesAmount])
Q10. What are some of the DAX functions?
Below is one of the most commonly used DAX functions:
The FILTER function returns a table with a filter condition applied for each of its source table rows. The FILTER function is rarely used in isolation; it is generally used as a parameter to other functions such as CALCULATE.
FILTER is an iterator and thus can negatively impact performance over large source tables.
Complex filtering logic can be applied, such as referencing a measure in a filter expression.
FILTER(MyTable, [SalesMetric] > 500)
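For intuition only, the row-by-row behavior of FILTER feeding a reduced table into an aggregation (the role CALCULATE plays around it) can be mimicked in plain Python. The table and column names below are made up and are not real DAX objects:

```python
# Toy model of DAX's FILTER(MyTable, [SalesMetric] > 500) wrapped in an
# aggregation. A "table" here is just a list of row dicts; names are
# illustrative assumptions, not the actual model from the article.
my_table = [
    {"product": "A", "SalesMetric": 700},
    {"product": "B", "SalesMetric": 300},
    {"product": "C", "SalesMetric": 900},
]

def dax_filter(table, predicate):
    # FILTER is an iterator: it evaluates the predicate once per source row,
    # which is why it can hurt performance over large tables.
    return [row for row in table if predicate(row)]

filtered = dax_filter(my_table, lambda row: row["SalesMetric"] > 500)
total = sum(row["SalesMetric"] for row in filtered)  # CALCULATE-style aggregation
print(total)
```

This also makes the performance warning concrete: the predicate runs for every row of the source table before any aggregation happens.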
Q12. What are the functions and limitations of DAX?
These are the only functions that allow you to modify the filter context of measures or tables:
Add to the existing filter context of queries.
Override the filter context from queries.
Remove the existing filter context from queries.
Limitations:
Filter parameters can only operate on a single column at a time.
Filter parameters cannot reference a metric.
Q13. What are SUMMARIZE() and SUMMARIZECOLUMNS() in DAX?
SUMMARIZE():
The main group-by function in SSAS.
The recommended practice is to specify the table and group-by columns but not metrics. You can use the ADDCOLUMNS function.
SUMMARIZECOLUMNS():
The new group-by function for SSAS and Power BI Desktop; more efficient.
Specify group-by columns, table, and expressions.
Q14. What are some benefits of using variables in DAX?
Below are some of the benefits:
By declaring and evaluating a variable, the variable can be reused multiple times in a DAX expression, thus avoiding additional queries of the source database.
Variables can make DAX expressions more intuitive/logical to interpret.
Variables are only scoped to their measure or query; they cannot be shared among measures or queries, or be defined at the model level.
Q15. How would you create trailing X month metrics via DAX against a non-standard calendar?
The solution will involve:
The CALCULATE function to control (take over) the filter context of measures.
ALL to remove existing filters on the date dimension.
FILTER to identify which rows of the date dimension to use.
Alternatively, CONTAINS may be used:
CALCULATE(FILTER(ALL('DATE'), ...))
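To see what that DAX pattern is doing, here is a rough Python sketch of the same logic: ignore whatever filter is currently applied (ALL), build the set of periods in the trailing window (FILTER), then aggregate over it (CALCULATE). The monthly data and the 3-month window are made-up illustrations:

```python
from datetime import date

# Hypothetical monthly sales -- stand-in for the date dimension and fact table.
sales_by_month = {
    date(2021, 1, 1): 100,
    date(2021, 2, 1): 120,
    date(2021, 3, 1): 90,
    date(2021, 4, 1): 110,
}

def trailing_months_total(data, anchor, months):
    # Build the month-starts in the trailing window by walking backwards one
    # month at a time. A non-standard calendar would supply its own period
    # list here instead of calendar months -- that is the whole point of the
    # ALL + FILTER approach in DAX.
    window = []
    y, m = anchor.year, anchor.month
    for _ in range(months):
        window.append(date(y, m, 1))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return sum(v for d, v in data.items() if d in window)

# Trailing 3 months anchored at April 2021: Feb + Mar + Apr.
print(trailing_months_total(sales_by_month, date(2021, 4, 1), 3))
```

The DAX version does the same thing declaratively: ALL('DATE') discards the incoming date filter, and the FILTER expression re-admits only the rows belonging to the trailing window.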
Q16. What are the different BI add-ins for Excel?
Below are the most important BI add-ins for Excel:
Power Query – It helps in finding, editing, and loading external data.
Power Pivot – It is mainly used for data modeling and analysis.
Power View – It is used to design visual and interactive reports.
Power Map – It helps to display insights on a 3D map.
Q17. What is Power Pivot?
Power Pivot is an add-in for Microsoft Excel 2010 that enables you to import millions of rows of data from multiple data sources into a single Excel workbook. It lets you create relationships between heterogeneous data, create calculated columns and measures using formulas, and build PivotTables and PivotCharts. You can then further analyze the data so that you can make timely business decisions without requiring IT assistance.
Q18. What is Power Pivot Data Model?
It is a model that is made up of data types, tables, columns, and table relations. These data tables are typically constructed to hold data for a business entity.
Q19. What is xVelocity in-memory analytics engine used in Power Pivot?
The main engine behind Power Pivot is the xVelocity in-memory analytics engine. It can handle large amounts of data because it stores data in columnar databases and uses in-memory analytics, which results in faster processing of data as it loads all data into RAM.
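The row-store versus column-store distinction behind engines like xVelocity can be sketched with plain Python structures (real engines add compression and vectorized scans on top; the data here is made up):

```python
# Row layout: one dict per record, all fields together.
rows = [{"id": i, "amount": i % 100, "region": "IN"} for i in range(10_000)]

# Columnar layout: each column stored contiguously on its own.
columns = {
    "id": [r["id"] for r in rows],
    "amount": [r["amount"] for r in rows],
    "region": [r["region"] for r in rows],
}

# An aggregate over one column touches only that column in the columnar
# layout, while the row layout must walk every full row to reach "amount".
col_total = sum(columns["amount"])
row_total = sum(r["amount"] for r in rows)

assert col_total == row_total
print(col_total)
```

Scanning one contiguous column is also what makes per-column compression and cache-friendly access possible, which is why columnar storage suits analytics workloads.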
Q20. What are some of the differences in data modeling between Power BI Desktop and Power Pivot for Excel?
Here are some of the differences:
Power BI Desktop supports bi-directional cross-filtering relationships, security, calculated tables, and DirectQuery options.
Power Pivot for Excel has single-direction (one-to-many) relationships and calculated columns only, and supports import mode only. Security roles cannot be defined in Power Pivot for Excel.
Q21. Can we have more than one active relationship between two tables in the data model of Power Pivot?
No, we cannot have more than one active relationship between two tables. However, we can have more than one relationship between two tables, but there will be only one active relationship and many inactive relationships. The dotted lines are inactive and the continuous lines are active.
Q22. What is Power Query?
Power Query is an ETL tool used to shape, clean, and transform data using intuitive interfaces without having to use coding. It helps the user to:
Import data from a wide range of sources: files, databases, big data, social media data, etc.
Join and append data from multiple data sources.
Shape data as per requirements by removing and adding data.
Q23. What are the data destinations for Power Query?
There are two destinations for the output we get from Power Query:
Load to a table in a worksheet.
Load to the Excel Data Model.
Q24. What is query folding in Power Query?
Query folding is when steps defined in Power Query/Query Editor are translated into SQL and executed by the source database rather than the client machine. It is important for processing performance and scalability, given the limited resources on the client machine.
Q25. What are some common Power Query/Editor Transforms?
Changing data types, filtering rows, choosing/removing columns, grouping, splitting a column into multiple columns, adding new columns, etc.
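The transforms listed above map onto operations you can sketch with Python's standard library; the CSV content below is invented purely to walk through type changes, row filtering, column selection, and grouping in one pass:

```python
import csv
import io
from itertools import groupby

# Made-up data standing in for a Power Query source.
raw = io.StringIO("region,product,sales\nIN,pen,5\nIN,book,7\nUS,pen,3\nUS,book,9\n")
records = list(csv.DictReader(raw))

# Changing data types: "sales" arrives as text, cast it to int.
for r in records:
    r["sales"] = int(r["sales"])

# Filtering rows: keep only rows with sales above 4.
filtered = [r for r in records if r["sales"] > 4]

# Choosing/removing columns: keep only region and sales.
slim = [{"region": r["region"], "sales": r["sales"]} for r in filtered]

# Grouping: total sales per region (groupby requires sorted input).
slim.sort(key=lambda r: r["region"])
totals = {k: sum(r["sales"] for r in g)
          for k, g in groupby(slim, key=lambda r: r["region"])}
print(totals)
```

In Power Query each of these steps would be a recorded Applied Step, and (per the query-folding answer above) many of them can be pushed down to the source database instead of running on the client.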
Q26. Can SQL and Power Query/Query Editor be used together?
Yes, a SQL statement can be defined as the source of a Power Query/M function for additional processing/logic. This is a good practice to ensure that an efficient database query is passed to the source, avoiding unnecessary processing and complexity on the client machine and in the M function.
Q28. What are query parameters and Power BI templates?
Query parameters can be used to provide users of a local Power BI Desktop report with a prompt to specify the values they are interested in.
The parameter selection can then be used by the query and calculations.
PBIX files can be exported as templates (PBIT files).
Templates contain everything in the PBIX except the data itself.
Parameters and templates can make it possible to share/email smaller template files and limit the amount of data loaded into the local PBIX files, improving processing time and experience.
Q29. Which language is used in Power Query?
Power Query uses a programming language called M-code. It is easy to use and similar to other languages. M-code is a case-sensitive language.
Q30. Why do we need Power Query when Power Pivot can import data from the most-used sources?
Power Query is a self-service ETL (Extract, Transform, Load) tool which runs as an Excel add-in. It allows users to pull data from various sources, manipulate said data into a form that suits their needs, and load it into Excel. It is preferable to use Power Query over Power Pivot as it lets you not only load the data but also manipulate it as per the user's needs while loading.
Q31. What is Power Map?
Power Map is an Excel add-in that provides you with a powerful set of tools to help you visualize and gain insight into large sets of data that have a geo-coded component. It can help you produce 3D visualizations by plotting up to a million data points in the form of column, heat, and bubble maps on top of a Bing map. If the data is time-stamped, it can also produce interactive views that display how the data changes over space and time.
Q32. What are the primary requirements for a table to be used in Power Map?
The primary requirement for the table is that it contains unique rows. It must also contain location data, which can be in the form of a Latitude/Longitude pair, although this is not a requirement. You can use address fields instead, such as Street, City, Country/Region, Zip Code/Postal Code, and State/Province, which can be geolocated by Bing.
Q33. What are the data sources for Power Map?
The data can either be present in Excel or could be present externally. To prepare your data, make sure all of the data is in Excel table format, where each row represents a unique record. Your column headings or row headings should contain text instead of actual data, so that Power Map will interpret it correctly when it plots the geographic coordinates. Using meaningful labels also makes value and category fields available to you when you design your tour in the Power Map Tour Editor pane.
To use a table structure which more accurately represents time and geography inside Power Map, include all of the data in the table rows and use descriptive text labels in the column headings.
In case you wish to load your data from an external source:
In Excel, click Data > the connection you want in the Get External Data group.
Follow the steps in the wizard that starts.
On the last step of the wizard, make sure Add this data to the Data Model is checked.
Q34. What is Power View?
Ans: Power View is a data visualization technology that lets you create interactive charts, graphs, maps, and other visuals which bring your data to life. Power View is available in Excel, SharePoint, SQL Server, and Power BI.
The following visualizations are available in Power View:
Charts
Line charts
Pie charts
Maps
Tiles
Cards
Images
Tables
Multiples visualizations
Bubble and scatter charts
Key performance indicators (KPIs)
Q35. What is Power BI Designer?
Ans: It is a standalone application where we can make Power BI reports and then upload them to Powerbi.com; it does not require Excel. Essentially, it is a combination of Power Query, Power Pivot, and Power View.
Q36. Can we refresh our Power BI reports once uploaded to the cloud (SharePoint or Powerbi.com)?
Ans: Yes, we can refresh our reports through the Data Management Gateway (for SharePoint) and the Power BI Personal Gateway (for Powerbi.com).
Q37. What are the different types of refresh for our published reports?
Ans: There are four main types of refresh in Power BI: package refresh, model or data refresh, tile refresh, and visual container refresh.
Package refresh – This synchronizes your Power BI Desktop or Excel file between the Power BI service and OneDrive or SharePoint Online. However, this does not pull data from the original data source. The dataset in Power BI will only be updated with what is in the file within OneDrive or SharePoint Online.
Model/data refresh – This refers to refreshing the dataset, within the Power BI service, with data from the original data source. This is done either by using scheduled refresh or refresh now. This requires a gateway for on-premises data sources.
Tile refresh – Tile refresh updates the cache for tile visuals on the dashboard once data changes. This happens about every fifteen minutes. You can also force a tile refresh by selecting the ellipsis (…) in the upper right of a dashboard and selecting Refresh dashboard tiles.
Visual container refresh – Refreshing the visual container updates the cached report visuals within a report once the data changes.
Q38. Is Power BI available on-premises?
No, Power BI is not available as a private, internal cloud service. However, with Power BI and Power BI Desktop, you can securely connect to your own on-premises data sources. With the On-premises Data Gateway, you can connect live to your on-premises SQL Server Analysis Services and other data sources. You can also schedule refresh with a centralized gateway. If a gateway is not available, you can refresh data from on-premises data sources using the Power BI Gateway – Personal.
Q39. What is data management gateway and Power BI personal
gateway?
A gateway acts as a bridge between on-premises data sources and Azure cloud services.
Personal Gateway:
- Import only, Power BI service only, no central monitoring/managing.
- Can only be used by one person (personal); can't allow others to use this gateway.
On-Premises Gateway:
- Import and Direct Query supported.
- Multiple users of the gateway for developing content.
- Central monitoring and control.
Q40. What is Power BI Q&A?
Power BI Q&A is a natural language tool that helps you query your data and get the results you need from it. You do this by typing into a dialog box on your dashboard, and the engine instantaneously generates an answer similar to Power View. Q&A interprets your question and shows you a restated query of what it is looking for in your data. Q&A was developed by the Server and Tools, Microsoft Research, and Bing teams to give you the feeling of truly exploring your data.
Q41. What are some ways that Excel experience can be leveraged with Power BI?
Below are some of the ways in which Excel can be leveraged with Power BI:
- The Power BI Publisher for Excel can be used to pin Excel items (charts, ranges, pivot tables) to the Power BI Service.
- The Power BI Publisher for Excel can also be used to connect to datasets and reports stored in the Power BI Service.
- Excel workbooks can be uploaded to Power BI and viewed in the browser, like Excel Services.
- Excel reports in the Power BI service can be shared via Content Packs like other reports.
- Excel workbooks (model and tables) can be exported to the service for Power BI report creation.
- Excel workbook Power Pivot models can be imported to Power BI Desktop models.
Q42. What is a calculated column in Power BI and why would you use
them?
Calculated Columns are DAX expressions
that are computed during the model’s processing/refresh process for each row of
the given column and can be used like any other column in the model.
Calculated columns are not compressed
and thus consume more memory and result in reduced query performance. They can
also reduce processing/refresh performance if applied on large fact tables and
can make a model more difficult to maintain/support given
that the calculated column is not
present in the source system.
Q43. How is data security implemented in Power BI?
Power BI can apply Row Level Security roles to models.
- A DAX expression is applied on a table, filtering its rows at query time.
- Dynamic security involves the use of USERNAME functions in security role definitions.
- Typically a table is created in the model that relates users to specific dimensions and a role.
Q44. What are many-to-many relationships and how can they be addressed in Power BI?
Many-to-many relationships involve a bridge or junction table reflecting the combinations of two dimensions (e.g. doctors and patients) – either all possible combinations or only those combinations that have occurred. They can be addressed in several ways:
- Bi-directional cross-filtering relationships can be used in PBIX.
- The CROSSFILTER function can be used in Power Pivot for Excel.
- DAX can be used per metric to check and optionally modify the filter context.
Q45. Why might you have a table in the model without any
relationships to other tables?
There are mainly two reasons why we would have tables without relationships in our model:
- A disconnected table might be used to present the user with parameter values to be exposed and selected in slicers (e.g. a growth assumption). DAX metrics could retrieve this selection and use it with other calculations/metrics.
- A disconnected table may also be used as a placeholder for metrics in the user interface. It may not contain any rows of data and its columns could be hidden, but all metrics are visible.
Q46. What is the Power BI Publisher for Excel?
You can use the Power BI Publisher for Excel to pin ranges, pivot tables, and charts to Power BI.
- The user can manage the tiles – refresh them, remove them – in Excel.
- Pinned items must be removed from the dashboard in the service (removing in Excel only deletes the connection).
- The Power BI Publisher for Excel can also be used to connect from Excel to datasets that are hosted in the Power BI Service.
- An Excel pivot table is generated with a connection (ODC file) to the data in Azure.
Q47. What are the differences between a Power BI Dataset, a Report, and a Dashboard?
Dataset: the source used to create reports and visuals/tiles.
- A data model (local to PBIX or XLSX) or a model in an Analysis Services server.
- Data could be inside the model (imported) or a Direct Query connection to a source.
Report: an individual Power BI Desktop file (PBIX) containing one or more report pages.
- Built for a deep, interactive analysis experience for a given dataset (filters, formatting).
- Each report is connected to at least one dataset.
- Each page contains one or more visuals or tiles.
Dashboard: a collection of visuals or tiles pinned from different reports and, optionally, a pinned live report page.
- Built to aggregate primary visuals and metrics from multiple datasets.
Q48. What are the three Edit Interactions options of a visual tile
in Power BI Desktop?
The 3 edit interaction options are Filter, Highlight, and None.
Filter: completely filters a visual/tile based on the filter selection of another visual/tile.
Highlight: highlights only the related elements on the visual/tile, graying out the non-related items.
None: ignores the filter selection from another tile/visual.
Q49. What are some of the differences in report authoring
capabilities between using a live or direct query connection such as to an
Analysis Services model, relative to working with a data model local to the
Power BI Desktop file?
With a data model local to the PBIX
file (or Power Pivot workbook), the author has full control over the queries,
the modeling/relationships, the metadata and the metrics.
With a live connection to an Analysis
Services database (cube) the user cannot create new metrics, import new data,
change the formatting of the metrics, etc – the user can only use the
visualization, analytics, and formatting available on the report canvas.
With a direct query model in Power BI
to SQL Server, for example, the author has access to the same features (and
limitations) available to SSAS Direct
Query mode.
Only
one data source (one database on one server) may be used, certain DAX functions
are not optimized, and the user cannot use Query Editor functions that cannot
be translated into SQL statements.
Q50. How does SSRS integrate with Power BI?
Below are some of the ways through which SSRS can be integrated with Power BI:
- Certain SSRS report items, such as charts, can be pinned to Power BI dashboards.
- Clicking the tile in a Power BI dashboard will bring the user to the SSRS report.
- A subscription is created to keep the dashboard tile refreshed.
- Power BI reports will soon be able to be published to the SSRS portal.
Location – Bangalore
Job Title – Business Analyst
Experience Required – 2-3 Years
Number of Rounds – 4
Round 1 – Aptitude and Guesstimate Questions(Written)
Round 2 – SQL round(Written)
Round 3 – Technical Interview with hands-on coding(SQL, R/Python, and MS Excel)
Round 4 – HR round
Publicis Sapient, formerly Sapient, is a digital transformation partner helping established organizations get to their future, digitally-enabled state by fusing strategy, consulting, and customer experience with agile engineering. It was founded in 1990.
Location – Bangalore
Job Title – Business Analyst
Experience required – 1-3 years
Number of Rounds – 4
Round 1 – Telephonic Round
The telephonic interview lasted for ~45 minutes where the questions were mostly on the tools and technologies I have worked on in my previous organization. Slowly, the questions shifted to SQL and statistics. Following are the questions which were asked:-
What is the output of SELECT NULL+0?
NULL. Any arithmetic expression involving NULL evaluates to NULL.
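You can verify this NULL-propagation behaviour with Python's built-in sqlite3 module (a quick illustrative sketch, not part of the interview answer):

```python
import sqlite3

# In-memory database just to demonstrate NULL propagation in arithmetic.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Any arithmetic involving NULL yields NULL (surfaced as Python's None).
result = cur.execute("SELECT NULL + 0").fetchone()[0]
print(result)  # None
conn.close()
```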
What are the ranking functions in SQL?
There are mainly 3 types of ranking functions:-
Rank()
Row_Number()
Dense_Rank()
What is a partition by clause and how is it used?
The PARTITION BY clause is used to divide the whole data into different parts depending on the column on which it is partitioned. Suppose the data contains 50 rows covering 6 states; if you partition by state, the data set will be treated separately within each of the 6 partitions. The syntax for PARTITION BY with ROW_NUMBER() is given below:
SELECT *, ROW_NUMBER() OVER (PARTITION BY State ORDER BY population DESC) AS row_num
FROM Table_Name
So, a new column row_num will be added to the result, giving a row number to all the rows of a state starting from 1. Once the rows of a particular state are over, it will take up another state and restart the counting from 1.
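The PARTITION BY behaviour described above can be demonstrated end-to-end with Python's built-in sqlite3 module (SQLite 3.25+ is needed for window functions; the Cities table and numbers below are made up for illustration):

```python
import sqlite3

# Toy State/City/Population table -- the data is invented for the demo.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Cities (State TEXT, City TEXT, Population INTEGER)")
cur.executemany(
    "INSERT INTO Cities VALUES (?, ?, ?)",
    [("KA", "Bangalore", 110), ("KA", "Mysore", 10),
     ("MH", "Mumbai", 200), ("MH", "Pune", 70), ("MH", "Nagpur", 25)],
)

# ROW_NUMBER() restarts at 1 inside every State partition.
rows = cur.execute("""
    SELECT State, City,
           ROW_NUMBER() OVER (PARTITION BY State ORDER BY Population DESC) AS row_num
    FROM Cities
    ORDER BY State, row_num
""").fetchall()
for row in rows:
    print(row)
# ('KA', 'Bangalore', 1) ... ('MH', 'Mumbai', 1) -- numbering restarts per state
conn.close()
```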
What is A/B Testing?
A/B testing is a form of statistical hypothesis testing with two variants, leading to the technical term two-sample hypothesis testing used in the field of statistics. In simple words, A/B testing in web analytics is used to compare the performance of two web designs to find the better design. Suppose you have two designs to display an advertisement on your website, one being a picture and the other a text or link; you can compare the performance of the two designs with A/B testing.
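A minimal sketch of the two-sample test behind A/B testing, using only the Python standard library (the visitor and conversion counts are hypothetical):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sample z-test comparing the conversion rates of designs A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF, computed via erf (stdlib only).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: design A converted 120/1000 visitors, B converted 160/1000.
z, p = two_proportion_ztest(120, 1000, 160, 1000)
print(round(z, 2), round(p, 4))
```

A small p-value (below 0.05, say) suggests the difference between the two designs is unlikely to be chance.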
What is regression?
Regression is a form of predictive modeling technique to determine the strength of the relationship between a dependent and an independent variable. One of these variables is called the predictor variable, whose value is gathered through experiments. The other variable is called the response variable, whose value is derived from the predictor variable. Y = aX + b – linear regression (X is the predictor variable and Y is the response variable).
Give some example of regression?
Regression is used for forecasting, time series modeling, and finding the causal effect relationship between variables. For example, the relationship between rash driving and the number of road accidents by a driver is best studied through regression.
What is multiple regression?
Multiple regression is an extension of linear regression to the relationship between more than two variables. In a simple linear relation we have one predictor and one response variable, but in multiple regression we have more than one predictor variable and one response variable: Y = a1X1 + a2X2 + … + b
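The line Y = aX + b can be fitted with ordinary least squares in a few lines of plain Python; the data points below are made up to lie exactly on Y = 2X + 1, so the fit should recover a = 2 and b = 1:

```python
def fit_line(xs, ys):
    """Ordinary least squares for Y = a*X + b (one predictor, one response)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Slope = covariance(X, Y) / variance(X); intercept from the means.
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [3, 5, 7, 9, 11]
a, b = fit_line(xs, ys)
print(a, b)  # 2.0 1.0
```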
What is DENSE_RANK() function?
DENSE_RANK() again is a ranking function which is very similar to RANK() function. The only difference is that it does not miss any rank even if there are duplicates in the table.
Syntax of DENSE_RANK() function
SELECT *, DENSE_RANK() OVER (PARTITION BY Column1 ORDER BY Column2 DESC)
FROM Table_Name
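The difference between RANK() and DENSE_RANK() on ties can be seen with Python's built-in sqlite3 module (SQLite 3.25+; the scores below are made up):

```python
import sqlite3

# Scores with a tie, to show that DENSE_RANK() keeps ranks consecutive
# while RANK() skips a value after each tie.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Scores (Name TEXT, Score INTEGER)")
cur.executemany("INSERT INTO Scores VALUES (?, ?)",
                [("A", 90), ("B", 90), ("C", 80), ("D", 70)])

rows = cur.execute("""
    SELECT Name,
           RANK()       OVER (ORDER BY Score DESC) AS rnk,
           DENSE_RANK() OVER (ORDER BY Score DESC) AS dense_rnk
    FROM Scores
    ORDER BY Score DESC, Name
""").fetchall()
for row in rows:
    print(row)
# RANK gives 1, 1, 3, 4 while DENSE_RANK gives 1, 1, 2, 3
conn.close()
```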
There was a question on self-join where you have to get the employee name and manager name from a table having 3 columns, EmployeeID, EmployeeName, ManagerID
SELECT e1.Name AS EmployeeName, e2.Name AS ManagerName
FROM Employee AS e1
INNER JOIN Employee AS e2
ON e1.ManagerID = e2.EmployeeID
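A runnable sketch of this self-join, again via sqlite3 (the employee names are made up for illustration):

```python
import sqlite3

# Tiny Employee table whose ManagerID points back into the same table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Employee (EmployeeID INTEGER, Name TEXT, ManagerID INTEGER)")
cur.executemany("INSERT INTO Employee VALUES (?, ?, ?)",
                [(1, "Asha", None), (2, "Ravi", 1), (3, "Meera", 1)])

# Self-join: e1 is the employee row, e2 the matching manager row.
rows = cur.execute("""
    SELECT e1.Name AS EmployeeName, e2.Name AS ManagerName
    FROM Employee AS e1
    INNER JOIN Employee AS e2 ON e1.ManagerID = e2.EmployeeID
    ORDER BY e1.EmployeeID
""").fetchall()
print(rows)  # [('Ravi', 'Asha'), ('Meera', 'Asha')]
conn.close()
```

Note that the INNER JOIN drops the top-level employee (NULL ManagerID); a LEFT JOIN would keep her with a NULL manager name.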
There were a few questions on the project you are working on right now
Round 2 – Case Study
The Case Study topic was to recommend two food items to a customer who is new to the restaurant. You can find the complete analysis of this case study and other case studies here
Round 3 – Face to Face Technical Round
This round was mostly about past projects. I had a Natural Language Processing project, so the interview revolved around the same topic. Following questions were asked in this round:-
1. What was the project for?
A. The project was to do sentiment analysis on the survey data filled by online customers.
2. What algorithms/methods did you try?
A. We tried multiple algorithms, starting from TF-IDF, Part-Of-Speech tagging, n-gram, Lemmatization, Stemming, Tokenization, Latent Semantic Indexing, Sentiment Analysis.
3. What all methods do you need to perform in order to convert a keyword into its base form(Normalization)?
A. Lemmatization and Stemming
4. What is N-gram?
A. N-grams are simply all combinations of adjacent words or letters of length n that you can find in your text file.
For example, with n = 2:
This is a sentence
Bigrams = This is, is a, a sentence
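A small helper that generates word n-grams, matching the example above:

```python
def ngrams(text, n):
    """All runs of n adjacent words in the text."""
    words = text.split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

print(ngrams("This is a sentence", 2))  # ['This is', 'is a', 'a sentence']
print(ngrams("This is a sentence", 3))  # ['This is a', 'is a sentence']
```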
5. What is the use of TF-IDF?
A. TF-IDF stands for Term Frequency–Inverse Document Frequency. TF-IDF is a numerical statistic that helps to understand the importance of a particular word in a document. Term frequency gives you the number of times a particular word has occurred in a document, and inverse document frequency gives you the importance of the word. It helps in filtering out the most common words like a, an, the, was, etc., so you get only the important terms.
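A toy from-scratch TF-IDF in plain Python (one common variant of the weighting; the three documents are made up). Notice how a word present in every document, like "the", scores zero:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Toy TF-IDF: tf = count / len(doc), idf = log(N / df). Stdlib only."""
    n = len(docs)
    tokenized = [d.lower().split() for d in docs]
    df = Counter()                       # document frequency of each word
    for toks in tokenized:
        df.update(set(toks))
    scores = []
    for toks in tokenized:
        tf = Counter(toks)
        scores.append({w: (c / len(toks)) * math.log(n / df[w])
                       for w, c in tf.items()})
    return scores

docs = ["the cat sat", "the dog sat", "the dog barked loudly"]
scores = tf_idf(docs)
print(scores[0]["the"], scores[0]["cat"] > 0)  # 0.0 True
```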
6. What is Lemmatization?
Lemmatization takes into account the morphological analysis of the word. It converts a word into its pure root form by looking into the morphological information
studies – Third person, singular number, present tense of verb study
Lemma – study
studying – Gerund of the verb study
Lemma – study
As you can see, both the words studies and studying have been narrowed down to the lemma study.
7. Explain the complete flow of your NLP project
A. The brief of the process is given below with some coding examples:-
Step 1 – Get the text dataset
Step 2 – Extract the text using get_text() in Python
Step 3 – Split the text into tokens:
tokens = [t for t in text.split()]
Step 4 – Get the word-frequency count using the NLTK package in Python:
freq = nltk.FreqDist(tokens)
Step 5 – Remove stop words (iterate over tokens but remove from a copy, so the list is not modified while being traversed):
clean_tokens = tokens[:]
for token in tokens:
    if token in stopwords.words('english'):
        clean_tokens.remove(token)
Step 6 – Tokenize non-English words
Step 7 – Get synonyms and antonyms using the WordNet package from NLTK in Python
Step 8 – Stemming of words; I used the PorterStemmer algorithm:
stemmer = PorterStemmer()
Step 9 – Once we are done with stemming, go for lemmatization using the WordNet package:
lemma = WordNetLemmatizer()
Step 10 – Build a classifier. We can use Logistic Regression to create a baseline model. Later we used Naive Bayes classification.
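To keep things self-contained, here is a pure-Python sketch of steps 3-5 above, with a tiny hand-picked stop-word list standing in for NLTK's stopwords.words('english') and the survey sentence made up for illustration:

```python
from collections import Counter

# A tiny stand-in stop-word list; NLTK's English list is far larger.
STOPWORDS = {"a", "an", "the", "is", "was", "to", "of", "and"}

def preprocess(text):
    """Steps 3-5 in plain Python: split, lowercase, strip punctuation, drop stop words."""
    tokens = [t.lower().strip(".,!?") for t in text.split()]
    return [t for t in tokens if t and t not in STOPWORDS]

text = "The survey was great and the support team is great too."
clean_tokens = preprocess(text)
freq = Counter(clean_tokens)   # behaves like nltk.FreqDist for counting
print(freq.most_common(2))
```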
There were questions only on the logical part of the process and not on the code implementation. But, it’s always better to infuse coding example wherever you can. The interview lasted for around 1 hour.
Round 4 – Human Resource
Basic questions, like:-
1. Why are you quitting your present job?
2. What are your expectations from the company, and what are the company's expectations from you?
3. Salary negotiation
4. Have you ever led a team?
Salary offered – Best in the industry (5/5)