Private markets have traditionally been a space reserved for the wealthy and well-connected. We believe in a future where they are accessible to more investors and fundraisers. By leveling the playing field, we hope to create a more equitable economy, where inspiring companies are connected to inspired investors, whoever and wherever they are.
Leveraging our trusted brand, global networks, and incredible team, we’re building a technology-enabled ecosystem that is as diverse and dynamic as our investor network. As we progress on this ambitious journey, we’re looking for energetic and creative people to support our mission and leave their mark on our platform.
• We have big plans to disrupt the traditional fundraising process for private businesses
• You will work with a diverse team of former investment bankers, strategy consultants and business owners in developing, monitoring, and improving products to facilitate the activity of private investing
• Everything we do is focused on helping build the private capital markets for the next generation of business owners and investors
• We work really hard but play really hard as well
• Work with Data Scientists, Data Engineers, Data Analysts, and Software Engineers to build and manage data products and the Fundnel Data Warehouse
• Design, develop, and launch efficient and reliable data pipelines
• Resolve issues in existing data pipelines and build their successors
• Build modular pipelines to construct features and modelling tables
• Maintain the data warehouse architecture and relational databases
• Respond to incidents by performing root cause analysis and implementing corrective actions
• Write, document, and maintain highly readable code
• Obtain and ingest raw data at scale (including writing scripts, web scraping, calling APIs, and writing SQL queries)
• Conduct and participate in code reviews with peers
• Master’s or Bachelor’s degree in Computer Science or a related field, with a minimum of 6 months of IT experience
• Minimum 6 months of experience designing, building, and operationalizing medium- to large-scale data integration projects (structured & unstructured) with Data Lakes, Data Warehouses, BLOB Storage, and RDBMS using AWS integration services
• Minimum 6 months of hands-on experience in batch and real-time data integration and processing
• Strong proficiency with databases such as MySQL, PostgreSQL, and Redshift
• Solid background in programming languages such as Python, Scala, or Java; Python is a must
• Experience building and maintaining scalable ETL pipelines using Apache Airflow
• Prior experience with Big Data tooling (e.g. Hadoop, Spark) and a good understanding of functional programming
The Ideal Candidate Should
• Be interested in working in a small-business (startup) environment
• Possess a strong interest in fintech / financial services
• Be interested in working with data and the functions and characteristics of companies
• Challenge conventional thinking and push design to achieve both functionality and beauty
• Have a basic understanding of the terms used in financial statements and of financial ratios
If you are interested, please send your resume to firstname.lastname@example.org. Only successful applicants will be contacted. Please come prepared for an intensive round of interviews.