Shenzhen (深圳), Guangdong, China
Big Data Engineer
General Information

Req #: WD00083144
Career area: Data Management and Analytics
Country/Region: China
State: Guangdong
City: Shenzhen (深圳)
Date: Wednesday, May 28, 2025
Working time: Full-time
Additional Locations:
* China - Guangdong - Shenzhen (深圳)

Why Work at Lenovo

We are Lenovo. We do what we say. We own what we do. We WOW our customers.
Lenovo is a US$57 billion revenue global technology powerhouse, ranked #248 in the Fortune Global 500, and serving millions of customers every day in 180 markets. Focused on a bold vision to deliver Smarter Technology for All, Lenovo has built on its success as the world’s largest PC company with a full-stack portfolio of AI-enabled, AI-ready, and AI-optimized devices (PCs, workstations, smartphones, tablets), infrastructure (server, storage, edge, high performance computing and software defined infrastructure), software, solutions, and services. Lenovo’s continued investment in world-changing innovation is building a more equitable, trustworthy, and smarter future for everyone, everywhere. Lenovo is listed on the Hong Kong stock exchange under Lenovo Group Limited (HKSE: 992) (ADR: LNVGY). 
To find out more visit www.lenovo.com, and read about the latest news via our StoryHub.

Description and Requirements

Job Responsibilities: 

1. Develop and maintain the big data platform, including data ingestion, cleaning, preprocessing, modeling, and analysis; 

2. Design and optimize data warehouse systems to support business analysis and data services; 

3. Provide high-quality data solutions to meet business needs; 

4. Participate in the operation and optimization of big data clusters such as StarRocks and Hadoop; 

5. Stay up to date with cutting-edge open-source big data technologies and continuously enhance platform capabilities.


Job Requirements:

1. Technical Skills:

Proficient in Java, Shell, SQL, and related programming languages;
Familiar with big data frameworks such as Hadoop, Hive, HBase, Kafka, Flink, and Paimon;
Expert in using, developing, and operating StarRocks.
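To give a concrete flavor of the StarRocks work involved, here is a minimal sketch of creating a table and continuously ingesting it from Kafka via Routine Load. The database, table, topic, and broker names are hypothetical, and exact options vary by StarRocks version.

-- Hypothetical target table for clickstream events (StarRocks 3.x syntax).
CREATE TABLE demo_db.user_events (
    event_time  DATETIME    NOT NULL,
    user_id     BIGINT      NOT NULL,
    event_type  VARCHAR(32),
    payload     STRING
)
DUPLICATE KEY (event_time, user_id)
PARTITION BY date_trunc('day', event_time)
DISTRIBUTED BY HASH (user_id);

-- Continuously pull JSON messages from a Kafka topic into the table;
-- JSON keys are matched to column names when no jsonpaths are given.
CREATE ROUTINE LOAD demo_db.load_user_events ON user_events
PROPERTIES (
    "format" = "json",
    "desired_concurrent_number" = "3"
)
FROM KAFKA (
    "kafka_broker_list" = "broker1:9092",
    "kafka_topic" = "user_events",
    "property.kafka_default_offsets" = "OFFSET_END"
);

Day to day, a job like this is paired with monitoring of load lag and compaction, which is the kind of cluster operation work the responsibilities above describe.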

2. Data Processing Skills:

Experienced in end-to-end data processes: ingestion, cleaning, preprocessing, modeling, storage, and governance;
Capable of designing and implementing business-oriented data solutions.
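As a rough illustration of the cleaning and preprocessing stage, the Hive-style sketch below keeps only the latest record per business key and drops malformed rows; the raw_events and dwd_events tables are hypothetical.

-- Hypothetical cleaning step: deduplicate on (user_id, order_id),
-- keeping the most recent event, and filter out rows missing
-- required fields.
INSERT OVERWRITE TABLE dwd_events
SELECT user_id, order_id, event_type, event_time
FROM (
    SELECT
        user_id, order_id, event_type, event_time,
        ROW_NUMBER() OVER (
            PARTITION BY user_id, order_id
            ORDER BY event_time DESC
        ) AS rn
    FROM raw_events
    WHERE user_id IS NOT NULL
      AND event_time IS NOT NULL
) deduped
WHERE rn = 1;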

3. Data Warehouse Experience:

Skilled in data warehouse modeling, ETL development, job scheduling, and performance tuning;
Experience in Hadoop and StarRocks cluster operation and migration is a plus.
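For concreteness, a minimal dimensional-modeling sketch in Hive SQL follows: a day-partitioned fact table plus the incremental ETL statement a scheduler would run daily. The table names and the ${dt} scheduler variable are hypothetical.

-- Hypothetical fact table in a star schema, partitioned by day so the
-- scheduler can load and backfill one partition at a time.
CREATE TABLE IF NOT EXISTS dwd_fact_orders (
    order_id     BIGINT,
    user_id      BIGINT,
    product_id   BIGINT,
    order_amount DECIMAL(18, 2)
)
PARTITIONED BY (dt STRING)
STORED AS ORC;

-- Daily incremental load from the ODS layer; ${dt} is injected by the
-- job scheduler at run time.
INSERT OVERWRITE TABLE dwd_fact_orders PARTITION (dt = '${dt}')
SELECT order_id, user_id, product_id, order_amount
FROM ods_orders
WHERE dt = '${dt}';

Partitioning by day keeps performance tuning simple: queries prune to the partitions they need, and a failed run can be retried by overwriting a single partition.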

4. Education & Experience:

Bachelor’s degree or above, preferably in Computer Science or a related field;
CET-4 or above, with a good command of English reading and writing;
3+ years of experience in the big data domain;
Experience in large-scale data warehouse projects is preferred.

5. Core Competencies:

Strong business understanding to translate requirements into technical solutions;
Excellent communication skills for effective collaboration with teams and stakeholders;
Strong ability to learn and keep up with big data technology trends.

6. Preferred Qualifications:

Familiarity with data governance frameworks or platforms;
Experience in cross-department collaboration and project management;
Knowledge of supply chain concepts and processes.