Careers

Join us as we enable the third wave of computing in the data center using FPGAs. Open positions are listed below.

Megh’s founders are Intel veterans who pioneered the adoption of FPGAs in the data center. We’re based in Portland, Oregon, and have development offices in Bangalore, India. We offer a fast-paced, exciting work environment with competitive salaries and benefits.

To apply, send your resume to jobs@meghcomputing.com.

Big Data Framework Engineer

Megh Computing is looking for a Big Data Framework Engineer to build real-time data analytics solutions on the Intel PAC card using the Intel OPAE stack. You must be a self-motivated team player who is excited about working on leading-edge technologies to solve customer problems and drive the success of the company.

Responsibilities

Primary responsibilities include:

  • Core feature development for Spark, Spark Streaming, Kafka, and other big data frameworks (see the illustrative sketch at the end of this posting).
  • Big data application development using Java, Spark, Python, and Scala.
  • Contribute to upstream open-source projects as opportunities arise.
  • Instrument, analyze, and optimize Spark core internals.
  • Design and implement unit tests for the solution within the test framework.
  • Work with software architects to design and implement applications and software infrastructure:
    • With reviews at each stage to ensure integration into the larger system
    • With an eye to future maintenance
    • With simplicity and clarity

Qualifications and Experience

The following qualifications are required:

  • BS/MS with 4-10 years of relevant experience.
  • Degree in CS, CE, EE, or a similar technical field.
  • Development experience in Java, Scala, and Python.
  • Knowledge of Spark, Spark Streaming, and Kafka.

The following qualifications are highly desirable (a successful candidate need not possess all of these qualifications, although the more the better):

  • Strong technical and problem solving skills.
  • Strong written and verbal communications skills.
  • Ability to define and execute tasks with limited direction.
  • Knowledge of Linux and/or Windows internals (both are desirable).
  • Knowledge of FPGA technology.
  • Experience in test or validation application development.
  • Experience with source code development, control, review and maintenance in C/C++.
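
To give candidates a flavor of the kind of pipeline work this role involves, below is a minimal, illustrative Spark Structured Streaming sketch that reads events from Kafka and maintains running counts in real time. The broker address, topic name, and aggregation logic are hypothetical placeholders, not Megh internals, and the example assumes the spark-sql-kafka connector is on the classpath.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{explode, split}

    // Minimal sketch, for illustration only: maintains running word counts
    // over records arriving on a Kafka topic.
    object StreamingCountSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("StreamingCountSketch")
          .getOrCreate()
        import spark.implicits._

        // Read a continuous stream of records from Kafka.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
          .option("subscribe", "events")                       // placeholder topic
          .load()

        // Kafka values arrive as bytes: cast to string, split into words,
        // and keep a running count per word.
        val counts = events
          .selectExpr("CAST(value AS STRING) AS line")
          .select(explode(split($"line", "\\s+")).as("word"))
          .groupBy($"word")
          .count()

        // Emit the running counts to the console; a production pipeline would
        // write to a real sink such as Kafka or a database.
        counts.writeStream
          .outputMode("complete")
          .format("console")
          .start()
          .awaitTermination()
      }
    }

In this role, the compute-heavy stages of pipelines like this are the candidates for FPGA offload via the Intel OPAE stack, but the framework-level driving logic looks much like the sketch above.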

Linux Apps/Kernel Engineer

Megh Computing is looking for a Linux Apps/Kernel Engineer to design and develop the compiler and runtime that map real-time data analytics solutions onto the Intel PAC card using the Intel OPAE stack. You must be a self-motivated team player who is excited about working on leading-edge technologies to solve customer problems and drive the success of the company.

Responsibilities

Primary responsibilities include:

  • Runtime and compiler application code development.
  • Design and implement unit tests for the solution within the test framework.
  • Work with software architects to design and implement applications and software infrastructure:
    • With reviews at each stage to ensure integration into the larger system
    • With an eye to future maintenance
    • With simplicity and clarity

Qualifications and Experience

The following qualifications are required:

  • BS/MS with 4-10 years of relevant experience.
  • Degree in CS, CE, EE, or a similar technical field.
  • Development experience with C++ and/or C.
  • Knowledge of Linux User and/or Kernel Mode development.

The following qualifications are highly desirable (a successful candidate need not possess all of these qualifications, although the more the better):

  • Strong technical and problem solving skills.
  • Strong written and verbal communications skills.
  • Ability to define and execute tasks with limited direction.
  • Knowledge of Linux and/or Windows internals (both are desirable).
  • Knowledge of FPGA technology.
  • Experience in test or validation application development.
  • Experience with source code development, control, review and maintenance in C/C++.

FPGA RTL Engineer

Megh Computing is looking for an FPGA RTL Engineer to develop infrastructure and algorithms for real-time data analytics solutions on the Intel PAC card using the Intel OPAE stack. You must be a self-motivated team player who is excited about working on leading-edge technologies to solve customer problems and drive the success of the company.

Responsibilities

Primary responsibilities include:

  • Develop FPGA RTL infrastructure to allow offload of compute-intensive algorithms onto the FPGA.
  • Redesign and remap big data and deep learning (DL) algorithms for the FPGA.
  • Design and implement unit tests for the solution within the test framework.
  • Work with software architects to design and implement applications:
    • With reviews at each stage to ensure integration into the larger system
    • With an eye to future maintenance
    • With simplicity and clarity

Qualifications and Experience

The following qualifications are required:

  • BS/MS with 4-10 years of relevant experience.
  • Degree in CE, EE, or a similar technical field.
  • Development experience with RTL/HLS on FPGA/ASIC.
  • Experience developing in VHDL, Verilog, and SystemVerilog.
  • Knowledge of Linux User and/or Kernel Mode development.

The following qualifications are highly desirable (a successful candidate need not possess all of these qualifications, although the more the better):

  • Experience with mapping various workloads to CPUs and accelerators, including implementing algorithms on FPGAs.
  • Prior experience working with heterogeneous (FPGA, GPGPU) hardware systems.
  • Strong technical and problem solving skills.
  • Strong written and verbal communications skills.
  • Ability to define and execute tasks with limited direction.
  • Experience in test or validation application development.