Job Responsibilities:
The Analyst will work with lead analysts to deliver analytics by:
a. Building analytics products to deliver automated, scaled insights in a self-serve manner (on the PBI/Tableau platforms)
b. Assisting with complex data pulls and data manipulation to develop analytics dashboards or conduct analytics deep dives
c. Scaling current efforts to productize analytics delivery by implementing "out of the box" solutions to deliver insights and recommending design enhancements on existing products (familiarity with AI visuals on PBI will help the cause)

Requirements & Qualifications:
• 4-8 years of experience in Analytics
• Strong logical, analytical, and problem-solving skills
• Good understanding of digital and data analytics
• …
You are required to set up a multinode environment consisting of a master node and multiple worker nodes. You are also required to set up a client program that communicates with the nodes based on the type of operation requested by the user. The operations expected for this project are:
WRITE: Given an input file, split it into multiple partitions and store them across multiple worker nodes.
READ: Given a file name, read the different partitions from the different workers and display the reassembled file to the user.
MAP-REDUCE: Given an input file, a mapper file, and a reducer file, execute a MapReduce job on the cluster.
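The WRITE and READ operations above can be sketched at their core: split a file's bytes into roughly equal partitions (one per worker), and later reassemble them in order. This is a minimal local sketch, not the networked implementation — the function names and the byte-level, equal-size splitting strategy are assumptions, and the actual project would send each partition to a worker over a socket or RPC instead of keeping them in a list.

```python
import math

def split_into_partitions(data: bytes, num_workers: int) -> list[bytes]:
    """Split file contents into roughly equal byte ranges, one per worker.

    The last partition may be shorter when len(data) is not divisible
    by num_workers. (Assumed strategy: contiguous byte-range splitting.)
    """
    if num_workers < 1:
        raise ValueError("need at least one worker")
    part_size = math.ceil(len(data) / num_workers)
    return [data[i:i + part_size] for i in range(0, len(data), part_size)]

def reassemble(partitions: list[bytes]) -> bytes:
    """READ side: concatenate partitions in their original order."""
    return b"".join(partitions)

# Example: 10 bytes across 3 workers -> partitions of 4, 4, and 2 bytes.
parts = split_into_partitions(b"abcdefghij", 3)
restored = reassemble(parts)
```

A master node would additionally record which worker holds which partition index, so that READ can fetch and order them correctly.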
I am working on a YouTube dataset from which I have to derive 5 insights using Spark, Hive, HDFS, and Elasticsearch.