
Big Data and Hadoop Support

₹12500-37500 INR

Closed
Posted over 5 years ago


Paid on delivery
Working on a Hadoop project which involves Spark with Scala, Hive, Impala, and Sqoop. Looking for 2 hours of support daily. Monthly payout: INR 20,000. Please mention your experience with all of these technologies.
Project ID: 17732704

About this project

16 proposals
Remote project
Active 6 years ago

16 freelancers are bidding on average ₹22,882 INR for this job
Hi, I am a data engineer with 3+ years of experience in the industry. I have worked on deploying highly scalable, resilient, and durable solutions using big data technologies, both in the cloud and on-premise. I have expertise in the following areas: 1. Spark pipelines in AWS and on-premise 2. Data ingestion to HDFS and Hive using Sqoop, and vice versa 3. Streaming analytics with Kafka and Spark Streaming 4. Scala, sbt 5. HDP and HDF distributions, both admin and developer track. Looking forward to hearing from you.
₹25,000 INR in 10 days
4.9 (22 reviews)
I have 2 years of experience in all these technologies and hold certifications in Spark and the Hadoop ecosystem as well. Let's discuss in chat to finalize the deal.
₹22,222 INR in 10 days
0.0 (0 reviews)
I am a certified Hadoop developer and have worked on many projects.
₹18,888 INR in 20 days
0.0 (0 reviews)
I have been working on these technologies for about a year and have gained a decent amount of knowledge: Hadoop - 1.6 years, Sqoop - 6 months, Spark with Java - 1 year, Scala - 3 months, Hive - 1 year.
₹15,555 INR in 15 days
0.0 (1 review)
What kind of work would it be? Data scrubbing, transformations, or what other kind of processing would you need performed?
₹22,222 INR in 10 days
0.0 (0 reviews)
Currently, I am working with the same skills you require and have 1.5 years of experience. I graduated from one of India's top institutes, an NIT. I am a workaholic. I can provide support for up to 6 months. Relevant Skills and Experience: Spark, Hive, Sqoop, Scala, Java, SQL, Linux
₹22,222 INR in 90 days
0.0 (0 reviews)
• Possess 2 years of analysis and development experience working on projects and prototypes. • Hands-on experience with major components of the Hadoop ecosystem such as Apache Spark, MapReduce, HDFS, Hive, Pig, Sqoop, and HBase. • Implemented Apache Spark procedures such as text analytics and processing using its in-memory computing capabilities. • Experience in exporting and importing data with Sqoop between HDFS and relational database systems. • Capable of processing large sets of structured, semi-structured, and unstructured data, and of supporting systems application architecture. • Involved in all Software Development Life Cycle (SDLC) phases, including analysis, design, implementation, testing, and maintenance. • Excellent problem-solving and communication skills. • Learning and organizing skills matched with the ability to manage stress, time, and people effectively. • Ability to prioritize tasks and plan the workload based on importance. • Adaptable to any environment in a short span of time.
₹20,000 INR in 10 days
0.0 (0 reviews)
For the past 4 years, I have been working on various components of the Hadoop ecosystem, including Spark, Impala, Hive, and Sqoop. Yes, I can provide the assistance required. Relevant Skills and Experience: The reason for deploying Spark and Impala was fast execution of queries and jobs. Since their implementation, we have never missed traditional MapReduce for running Hive-on-Spark jobs.
₹55,555 INR in 10 days
0.0 (0 reviews)
4 years of experience in the Hadoop ecosystem, Spark, and Scala.
₹12,500 INR in 10 days
0.0 (0 reviews)
I currently work as a Hadoop developer on a project which involves Spark with Scala, Hive, Impala, Sqoop, and Python. With this experience, I can complete this work as well. I can work 2 hours on a daily basis.
₹22,222 INR in 12 days
0.0 (0 reviews)
I have 4 years of experience with the Hadoop ecosystem and good working experience with the Cloudera platform. I have experience in Sqoop, Flume, Kafka, MapReduce, Pig, Hive, HBase, Cassandra, Spark Core, and Spark SQL, and can create and transform RDDs in the Python API.
₹27,777 INR in 10 days
0.0 (0 reviews)

About the client

Pune, India
0.0 (0 reviews)
Member since Sep 8, 2018

Client verification

Other jobs from this client

DevOps Support
₹12500-37500 INR
Freelancer ® is a registered Trademark of Freelancer Technology Pty Limited (ACN 142 189 759)
Copyright © 2024 Freelancer Technology Pty Limited (ACN 142 189 759)