G Research: Data Science Tooling Expert
Seeking a candidate to identify and research new data science and machine learning technologies, and to support proofs of concept (POCs) of these technologies and of other underlying infrastructure.
At: G Research
Location: London, UK
Position: Data Science Tooling Expert
G-Research is a leading quantitative research and technology company. We research systematic investment ideas that predict the future of financial markets, using scientific techniques and the latest research to find patterns in large, noisy and rapidly changing real-world data sets. We use machine learning alongside robust statistical analysis to extract insights from big data. We offer a dynamic, flexible and stimulating environment where good ideas are prized and rewarded. Our culture is positive, and we are based in a fantastic central London location.
The Technology Innovation Group
G-Research is going through an exciting period of growth, transforming the way we conduct quantitative research as well as how we run and build our technology platforms. As part of this transformation we are creating a Technology Innovation Group (TIG). This is a great opportunity for an expert in data science tools and technologies to come on board and be part of a team discovering, exploring, setting up, evaluating and finding ways to best exploit new technologies across the entire stack. We look at all kinds of technology, from data science workbenches and visualisation tools to analytics environments, novel data stores, containerisation, parallelisation, machine learning, and server, storage and networking infrastructure.
You will join the core TIG team to identify and research new data science and machine learning technologies, and to support POCs of these technologies and of other underlying infrastructure. You will critically assess new technologies with regard to their ability to support the kind of research performed and planned by the business. You will work with the rest of the TIG to identify, evaluate and demonstrate the benefits of relevant new technologies and ways of working.
The responsibilities of the role include:
- Identifying, researching and evaluating new data science and machine learning tools, technologies and techniques that could be applicable to our research teams
- Participating in technology proofs of concept: using your experience of data science tools to assess and suggest the best ways of integrating new technologies with those tools, and using your experience of big data queries and machine learning to stress-test new infrastructure such as servers, storage, containers and networking
- Using your experience to understand the needs of data scientists and researchers around the business
- Creating demonstrations and reusable stress tests that can be applied to both our existing technologies and new technologies, enabling side-by-side comparison
- Creating reports and visualisations for benefits analyses of new technologies and for use in TIG newsletters
- Writing articles on trends in data science, on how to select the most appropriate technologies for solving different kinds of problems, and on how to get the best out of those technologies
The successful candidate will have a background in technology, be fully versed in a variety of data science tools and libraries, be passionate about new technology innovation, and be happy to work in a continuous-prototyping environment, learning new technologies on a regular basis.
Required skills and experience:
- You will need a strong background in mathematics
- A Master's or PhD degree in a highly quantitative subject (mathematics, statistics, computer science, physics or engineering) is desirable
- Previous financial experience is not required, although interest in finance and the motivation to rapidly learn more is a prerequisite for working here
- Since programming is an important part of the work, knowledge of numerical programming in an object-oriented language such as C#, C++, Scala, Java or Python will be useful
- Experience with a range of data science and big data technologies, such as Jupyter Notebook, visualisation plug-ins and programming libraries, as well as open-source big data technologies that enable distributed and parallel computing and batch and near-real-time data access, such as Hadoop, Spark, Kafka, Cassandra and GPUs
- Experience working with large data sets (terabytes)

G-Research has high academic hiring standards and prefers degree-qualified candidates. One thing all successful candidates have in common is a passion to use technology, new concepts and new ideas to solve complicated problems.
Salary: Highly competitive + bonus