This job post is no longer available

Data Engineer

Full-Time

Mid Level

Remote

01 Dec → 31 Dec

Braintrust

San Francisco, CA, USA

About Company

Braintrust is the first user-controlled talent network that aligns the interests of both talent and enterprises. Braintrust’s unique network model allows talent to retain 100% of their market rate while enabling organizations to pull together flexible teams of highly skilled technical and design talent, no matter where they’re located.

Job Description

Braintrust is a user-controlled talent network where you keep 100% of what you earn and actually get to own the platform. We've been onboarding some big clients and currently need a Data Engineer for one of them.

Our client is bridging real-world assets with digital ownership through an end-to-end property management protocol. The vision is to shift the NFT craze into a sustainable movement, where real-world assets can be represented on-chain with the legal protection required, combined with ease of trading and the endless transactional opportunities that blockchain unlocks.

JOB TYPE: Freelance, Contract Position 

LOCATION: Remote (REQUIRED TIME ZONES: GMT, GMT+1, GMT-5)

HOURLY RANGE: Our client is looking to pay $30-$90 USD/hr

ESTIMATED DURATION: 40 hrs/week - Short, one-time project

UK, Europe, or US East Coast time zones only.

C2C Candidates: This role is not available to C2C candidates working with an agency. If you are a professional contractor who has created an LLC/corp around your consulting practice, this is well aligned with Braintrust and we’d welcome your application.

Braintrust values the multitude of talents and perspectives that a diverse workforce brings. All qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status.

Job Responsibilities

  • You will work with our Product and Engineering teams and our Chief Product Officer to build a data pipeline that will feed a pricing and trade execution service.
  • Your work will support quantitative research and productization.

Requirements / Qualifications

  • 3-4+ years of work experience in data engineering / data-heavy software development
  • Strong SQL skills
  • Experience building, optimizing, and refining data sets, data architectures, and data pipelines
  • A general-purpose programming language (e.g., C#, Java, JavaScript, Python)
  • Experience with cloud-based ML tooling, such as SageMaker or TensorFlow
  • Experience with cloud-based ETL tooling, such as Stitch, Segment, or Kafka

How To Apply

Apply through GoRemotely.
