
Amena Zepherin

Administrators

Posts: 32
Days Won: 2

Amena Zepherin last won the day on August 19

Amena Zepherin had the most liked content!


Amena Zepherin's Achievements

Reputation: 2

  1. Hi Everyone, A huge thanks to everyone who supported and participated in Project:Hack10! I have attached the consolidated feedback and scores from all of our judges. We hope that you find the feedback helpful in preparing for Project:Hack11! It will be our first in-person hack in 2021! Hope to see you there! What do you think? What would you do to improve? Feedback Hack10 - PDA website upload.xlsx
  2. Challenge Overview Can we use GPT-3 to break down projects into the specific tasks that need completing? Can we use schedule data to train this model? Challenge Breakdown This challenge uses GPT-3, a heavily restricted natural language model. Consider the classic scenario of breaking down the task of making a cup of tea. Your task will be to use GPT-3 to create a tool that can break projects down into the tasks needed to complete them. This will allow project managers to compare against current task breakdowns and inform future ones. Those of you who have faced a GPT-3 challenge before will find yourselves once again challenged, and once again surprised by what GPT-3 makes possible.
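A minimal sketch of how a team might approach this: build a few-shot prompt (using the cup-of-tea example from the brief) for the legacy GPT-3 completions endpoint, then parse the model's numbered list back into task strings. The prompt format and function names are illustrative assumptions, not the official challenge solution.

```python
# Hypothetical sketch: prompting GPT-3 for a project task breakdown.
# The cup-of-tea few-shot example mirrors the challenge brief.

def build_breakdown_prompt(project):
    """Few-shot prompt asking the model for a numbered task list."""
    return (
        "Break the project into the tasks needed to complete it.\n\n"
        "Project: Make a cup of tea\n"
        "Tasks:\n"
        "1. Boil the kettle\n"
        "2. Put a tea bag in a mug\n"
        "3. Pour in the boiled water\n"
        "4. Brew, then remove the bag\n"
        "5. Add milk to taste\n\n"
        f"Project: {project}\nTasks:\n"
    )

def parse_tasks(completion):
    """Turn the model's numbered list back into plain task strings."""
    tasks = []
    for line in completion.splitlines():
        num, sep, rest = line.strip().partition(".")
        if sep and num.isdigit() and rest.strip():
            tasks.append(rest.strip())
    return tasks

# With the (legacy) OpenAI completions endpoint you would send
# build_breakdown_prompt("Refurbish an office") as the prompt and feed
# the returned text to parse_tasks to get a comparable task list.
```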
  3. Challenge Overview Can we use an application to allow employees to log quickly and easily what work they have done on each project? Can I receive a live visualisation of what work has been logged by my teams? Challenge Breakdown This challenge aims to tackle one of the biggest problems facing companies today: how to effectively monitor staff work. This will involve creating a simple and intuitive app to collect staff timesheets and present them live to management, giving them a clear picture of what work is being done on which project. You will need to carefully consider the user interface and what can be done to make this process as streamlined and fast as possible, even for those not used to recording timesheets or using applications.
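To make the data side concrete, here is a tiny sketch of the kind of record and aggregation such an app would sit on: one entry per logged chunk of work, rolled up per project for a live management view. The field names are assumptions for illustration.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TimesheetEntry:
    """One quick log: who did what, on which project, for how long."""
    employee: str
    project: str
    hours: float
    note: str = ""

def hours_by_project(entries):
    """Roll logged entries up into per-project totals for a dashboard."""
    totals = defaultdict(float)
    for entry in entries:
        totals[entry.project] += entry.hours
    return dict(totals)
```

A real app would persist these entries and push the totals to a live visual (e.g. Power BI), but the aggregation logic stays this simple.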
  4. Challenge Overview Can we predict when a piece of equipment will have a fault by using historical maintenance data? Can we use our equipment failure data to streamline our maintenance and reduce cost? Challenge Breakdown This challenge is all about working with an unclean data set. You will have to sort through the data to find insight and potentially use language processing to find the root cause of equipment failure from maintenance logs. You will be provided with two data sets, one with long descriptions and one without. You can pick and choose what you use for insight, with the goal of streamlining equipment maintenance, reducing how often equipment needs to be checked and potentially seeing whether equipment failure can be pre-empted.
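As a first step towards the language-processing angle, a simple keyword tally over the free-text maintenance logs can surface candidate root causes before any heavier modelling. The failure vocabulary below is an assumed starting list, not from the challenge data.

```python
import re
from collections import Counter

# Assumed starter vocabulary of failure terms; a real pass would
# grow this list from the logs themselves.
FAILURE_TERMS = {"leak", "corrosion", "overheat", "vibration", "wear"}

def root_cause_counts(logs):
    """Count how often each known failure term appears across log entries."""
    counts = Counter()
    for log in logs:
        for word in re.findall(r"[a-z]+", log.lower()):
            if word in FAILURE_TERMS:
                counts[word] += 1
    return counts
```

Sorting the counter gives a quick "most common failure modes" view to guide which equipment to check more (or less) often.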
  5. Challenge Overview Can we use analytics to determine how best to approach winning public sector contracts? Can we determine whether our bids will be successful and whether they are worth investing in? Challenge Breakdown This challenge is to see how we can use past sales data to inform future sales actions – predictive and prescriptive analytics. You will be using public sector contract data to see if it can be used to determine how to approach contract bids in the future. You will need to sort through the data and link it to other data sources, as well as potentially using word analysis to see what common trends there are in the comments about project development.
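One natural descriptive starting point, before any prediction, is a win rate per segment of past bids. A minimal sketch, assuming each historical bid is reduced to a (sector, won) pair:

```python
def win_rate_by_sector(bids):
    """bids: iterable of (sector, won) pairs, where won is a bool.
    Returns each sector's historical win rate."""
    stats = {}  # sector -> (wins, total)
    for sector, won in bids:
        wins, total = stats.get(sector, (0, 0))
        stats[sector] = (wins + int(won), total + 1)
    return {sector: wins / total for sector, (wins, total) in stats.items()}
```

Segments with consistently low win rates are candidates for either a changed bidding approach or not investing in the bid at all.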
  6. Challenge Overview Can we gain useful insight into when a project is likely to go over budget and when the work hours dedicated to a project are running beyond expectation? Can we visualise this information so it can be easily presented to management and key decision makers in the company, in a way that clearly identifies potential best and worst practice by different contract delivery teams? Challenge Breakdown This challenge is about measuring project performance. You will be provided with data on project timelines and budgets, and it will be up to you to compare teams and contracts in order to help mitigate the risk of struggling projects. You will need to present the data in a dashboard that highlights successful teams and offers advance warning of when a project may start going over budget. There are a lot of possibilities in the data, and it will be up to you to find the most impactful trends.
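The "advance warning" idea can be sketched with a simple burn-rate check: compare spend to date against the spend you would expect if the budget were consumed evenly over the schedule, with a tolerance before raising a flag. The linear-burn assumption and the 10% tolerance are illustrative choices, not from the challenge.

```python
def overbudget_warning(budget, spent, elapsed_days, total_days, tolerance=1.1):
    """Flag a project whose spend is running ahead of its schedule.

    Assumes spend should accrue roughly linearly over the project;
    tolerance=1.1 means warn only once spend is 10% ahead of that line.
    """
    expected = budget * (elapsed_days / total_days)
    return spent > expected * tolerance
```

A dashboard can run this per project per reporting period and colour the offenders, giving decision makers the early signal the brief asks for.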
  7. Challenge Overview Can we create a Python script to scrape target websites and trim articles to an appropriate size? Can we use automation to send those trimmed articles out to key people in a weekly email? Challenge Breakdown This challenge is about web scraping and summarising. Your job is to create a tool that accesses certain company websites, takes recently posted articles, summarises them, then sends a weekly email to employees. This email will contain all of that week's summarised articles in an easily digestible form. This tool could be used in almost any industry, and there are many opportunities to sharpen your coding and automation skills. We will also be providing access to GPT-3, the exciting natural language processing tool. In addition to creating the program to collect the articles, you are also welcome to use GPT-3's inbuilt tools to explore an alternative way to summarise and format the data.
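The trimming step can be as simple as keeping the first few sentences of each article; a sketch of that piece, with the fetch-and-send stages noted in comments (a real pipeline would use e.g. requests/BeautifulSoup to scrape and smtplib to email, but those are left out so this stays self-contained):

```python
import re

def summarise(article, max_sentences=2):
    """Trim an article to its first few sentences as a crude summary.
    GPT-3's own summarisation, as the brief suggests, is the fancier alternative."""
    sentences = re.split(r"(?<=[.!?])\s+", article.strip())
    return " ".join(sentences[:max_sentences])

def weekly_digest(articles):
    """articles: {title: full_text}. Build the body of the weekly email."""
    blocks = [f"{title}\n{summarise(text)}" for title, text in articles.items()]
    return "\n\n".join(blocks)

# Pipeline outline: scrape article pages -> weekly_digest(...) -> send via SMTP.
```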
  8. Challenge Overview Can we use automation to scrape through and catalogue our PDF training certificates? Can we create a tool that can be used for different awarding bodies? Challenge Breakdown A challenge all about analysing and storing the data from PDF certificates. In this challenge you will need to develop a tool, or set of tools, to read in PDFs with different formatting from different awarding bodies, so that company training records can be automatically stored and processed. This tool will be a massive time saver for companies that run a lot of regular training sessions.
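Once text has been pulled out of a PDF (with a library such as pdfplumber or PyPDF2), the per-awarding-body part reduces to a set of field patterns. A sketch, with entirely made-up patterns for one hypothetical body; supporting another body means adding another pattern set:

```python
import re

# Assumed layout for one hypothetical awarding body; a real tool would
# keep one pattern set per body and choose based on the PDF's layout.
PATTERNS = {
    "name": re.compile(r"This certifies that (.+)"),
    "course": re.compile(r"has completed (.+)"),
    "date": re.compile(r"Date:\s*(\d{2}/\d{2}/\d{4})"),
}

def extract_fields(text):
    """Pull structured certificate fields out of already-extracted PDF text."""
    fields = {}
    for key, pattern in PATTERNS.items():
        match = pattern.search(text)
        if match:
            fields[key] = match.group(1).strip()
    return fields
```

The extracted dicts can then be written straight into whatever training register the company keeps.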
  9. Challenge Overview Can we analyse all our scope 3 emission data and use it to find the outliers to target for improvement? Can we create an app that lets us view our scope 3 emissions live and show them based on certain filters, e.g. location, region or source? Challenge Breakdown This challenge is all about reporting scope 3 emissions. There is a lot to consider when it comes to what a company is indirectly responsible for in terms of emissions, and this challenge will be about sorting through that data and trying to find where emissions can be reduced. You will also be producing a system to calculate employee commuting emissions based on their address (data anonymised) and using that to find out which offices have the highest commuting emissions.
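The commuting-emissions calculation boils down to distance × frequency × an emission factor, aggregated per office. A sketch, assuming commute distances have already been derived from the anonymised addresses; the default factor is a placeholder, not an official figure:

```python
def commuting_emissions_by_office(trips, factor_kg_per_km=0.17):
    """trips: iterable of (office, round_trip_km, trips_per_week) tuples.

    factor_kg_per_km is an assumed average car emission factor (kg CO2e/km);
    a real report would use per-mode factors from a recognised source.
    Returns weekly kg CO2e per office.
    """
    totals = {}
    for office, km, per_week in trips:
        totals[office] = totals.get(office, 0.0) + km * per_week * factor_kg_per_km
    return totals
```

Sorting the result identifies the offices with the highest commuting emissions, which is exactly the outlier view the brief asks for.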
  10. Challenge Overview Can we create an app that can be used on sites to record our hazard data? Can we use object recognition to complete hazard reports? Challenge Breakdown This challenge is about using object recognition to partially automate hazard reporting. It has a clear use case in various industries and may become a great tool for hazard reporting in the future. You will need to create an application that can take a picture of a hazard and attempt to complete a hazard report on its own.
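The "partially automate" step can be sketched as a mapping from object-detector output labels (whatever vision model or API the team chooses) to pre-filled report rows. Every label and category below is an illustrative assumption:

```python
# Assumed mapping from detector labels to hazard-report fields.
LABEL_TO_HAZARD = {
    "ladder": ("Working at height", "Secure or remove unattended ladder"),
    "cable": ("Trip hazard", "Route or cover trailing cable"),
    "spill": ("Slip hazard", "Clean up and sign the spill"),
}

def draft_hazard_report(detected_labels):
    """Pre-fill report rows for each recognised label; a person then
    reviews and completes the draft rather than typing from scratch."""
    return [
        {"hazard": LABEL_TO_HAZARD[label][0], "action": LABEL_TO_HAZARD[label][1]}
        for label in detected_labels
        if label in LABEL_TO_HAZARD
    ]
```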
  11. Challenge Overview Can we create a tool to automate the collection of monthly report updates and combine them into a single document? Can we ensure that correct formatting is adhered to and that as much of the process is automated as possible? Challenge Breakdown This challenge is to create an automated tool that produces monthly project reports and heavily reduces manual admin. You will be creating a tool with a wide potential application that will be relevant to almost all industries. It will be important to make your tool as easy to use as possible and present it as an efficient time saver.
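At its core this is a collect-and-combine job; a minimal plain-text sketch of the combining step (a production tool would more likely target Word via python-docx or similar, which is why this is only an outline of the logic):

```python
def combine_reports(updates):
    """updates: {project_name: update_text}. Merge the monthly updates
    into one consistently formatted document, projects in sorted order."""
    sections = [
        f"## {name}\n{text.strip()}" for name, text in sorted(updates.items())
    ]
    return "Monthly Report\n\n" + "\n\n".join(sections)
```

Enforcing the sort order and the per-section template is what keeps the "correct formatting" requirement automated rather than manual.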
  12. My name is Alex, an MSc Data Science postgraduate student with Robert Gordon University in Aberdeen and a member of the teams that achieved 2nd and 3rd place in Project:Hack 8 and Project:Hack 9 respectively. Having graduated in law, worked in banking and asset management, and then pursued a master’s degree in data science, I love walking out of my little domain bubble. What are other companies doing, what are other industries doing and how are they doing it? You never know what you are going to learn or who you are going to meet, and information is valuable. It gives you perspective. These questions brought me to Project:Hack 8 in March of this year and, subsequently, Project:Hack 9 in June. Both were well organised and structured by Projecting Success, with technical support ready if needed. The challenges were clear, practical, and commercially relevant, and the pool of participants diverse: from students, apprentices and newly minted coders to data scientists, project managers and the wonderful people from Gleeds: James Garner, Nicola Herring, Manojit Sarkar, Nahid Jafar, Sheldon Atkinson and Basel Yousef, who welcomed and treated me like an equal member of the team although I was just a student. During Project:Hack 9 we tackled Challenge 11, namely a cost prediction model that uses machine learning to improve cost forecasting on construction projects. Whilst this challenge was within my field of data science, I had limited knowledge of the context in which the solution was being developed, namely construction projects, and of two of the programming languages. But I was curious and ready to put in the effort. We were pragmatic and creative in our approach, a reflection of Gleeds’ own vision and values, and within 18 hours we created an innovative solution that brought us 3rd place.
As well as working in HTML and Java for the first time, and interacting more with Power Apps, I also gained deeper insight into an industry that I had previously thought of as difficult to access given my background, but which I am now actively considering shifting to from financial services. As with everything in life, an opportunity is only as valuable as you make it. Whether looking to put your skills into practice, expand your portfolio or strengthen your business acumen for graduate programme applications, Project:Hack is well worth the investment as a student. Be brave, be intrigued and be yourself. The challenge sponsors want to work with you, they want to hear your ideas and see what you can create. Your input matters. As for me? I’m a convert. With two successful hacks behind me and many challenges still to be tackled, my work here is not done. Prior commitments are keeping me busy in August; however, roll on Project:Hack 11!
  13. If you’d asked me 6 months ago what a data analyst did, I would've given you a shrug at best, but after the intensive week involving a crash course in Python scripting and Power BI, followed by my first hackathon, I think I’ve turned into a data evangelist. Apprentices can take advantage of the Government Apprenticeship Levy, enabling them to get advanced analytics training at no cost to themselves or their employers. I previously spent two days at Project:Hack as a complete novice. Without a lot of tech skills to offer my small team, I focused on the user stories, dashboards and the presentation. But what I saw being developed by the hundreds of people in attendance was staggering. As a more mature student with a well-established role at Costain, I don’t really see myself becoming a full-time data analyst. What I would like to do is take the new skills I gained from the apprenticeship back to the projects, to produce and share useful project management tools. I also see a side-quest in championing a Data-as-an-Asset movement. Data is much like any real physical asset: it can be gathered in vast quantities, learned from and focused onto real-world outputs. To do this we need people who can translate client needs into problem statements that have answers lurking inside data. This is where I see myself operating. Andrews' foresight for data analytics is inspiring. Every intake at the academy demonstrates to us the value in bringing data analytics to project delivery and it's great to see them go on to pioneer change within their organizations.
  14. I’m loving learning about data analytics in a well-structured way. I’ve tried learning more about data analytics before, but there’s so much information out there and it’s difficult to separate the good from the bad. Before the apprenticeship I struggled with data visualisation and Python, but now I have developed a great love for both. The tutors explained all the concepts in an understandable way and the lessons were very rewarding. I have already completely changed careers to be much more involved with data analytics, and I believe this is all down to the apprenticeship. It has given me the key skills to be an effective data analyst, and it’s also given me the grounding to be able to look at other learning opportunities and add them to my knowledge base. My challenge required analysing material requisition data from a large infrastructure project. I wanted to get an understanding of material prices and price variations, and ultimately to try to develop a tool that could identify the best time to buy and raise an alert. A couple of data scientists joined the team and we seemed to be making good progress until we realised the data quality was bad. The project ended up being an exercise in data quality: how to improve it, and what impact the poor data quality had on the final output.
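The "best time to buy" tool described above can be sketched as a simple heuristic: flag a buy when the latest price drops meaningfully below its recent average. The window size and discount threshold are illustrative assumptions, not the team's actual model:

```python
def buy_signal(prices, window=3, discount=0.95):
    """True when the latest price sits below the recent average by at
    least the discount factor — a crude 'good time to buy' heuristic."""
    if len(prices) < window:
        return False  # not enough history to judge
    recent_avg = sum(prices[-window:]) / window
    return prices[-1] < recent_avg * discount
```

As the post notes, a rule like this is only as good as the price data feeding it, which is exactly where the data-quality lesson bites.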
  15. We all know projects are extremely complex systems influenced by an extremely large number of factors. Often project failure can be due to hidden causes that sneak up on project managers. Data holds the key to preventing this, as it allows us to peer into the past and better understand our mistakes. In many projects, one of the major issues holding us back is that we lack the quantity of useful data needed to perform in-depth project data analytics, and the data we do have is limited in scope. Currently, project managers mainly pull from 1st party datasets (datasets with a direct relationship with the problem), which does provide some insight. However, do many people know the true power of 3rd party data? What is Third Party Data? Third party data is data that sits outside the organisation. Often the data has no direct relationship with the problem at hand. A good example: when a project manager wants to know why there are delays on a construction project, they may use weather data to provide deeper insight into conditions on site. In doing this, project managers can have a holistic view of the project. Project professionals may find this interesting, but are there any real world examples of third party data being used? Well, the winners of Project:Hack 8 (Challenge 5a) displayed the power of third party data by solving this problem. Here cost managers need to understand price trends in key raw construction materials in order to know when it is best to buy them. So the team took data on construction materials from the Office for National Statistics and third party data from the London Stock Exchange to provide insight into the price of these raw materials. This allows cost managers to make more informed decisions when buying these materials, and it also allows data scientists to build better machine learning models that make more accurate predictions.
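The weather example above comes down to joining a first-party table (delays) with a third-party one (rainfall) on a shared key such as the date. A minimal sketch with hypothetical field names:

```python
def join_weather(delays, weather):
    """delays: {date: hours_lost} (1st party);
    weather: {date: rainfall_mm} (3rd party).
    Pairs each delay with that day's rainfall, None where weather is missing."""
    return [
        {"date": date, "hours_lost": hours, "rainfall_mm": weather.get(date)}
        for date, hours in sorted(delays.items())
    ]
```

With the two sources aligned like this, a project manager can start asking whether the biggest delay days were also the wettest, which is the holistic view the post describes.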
AWS Data Exchange In the previous example, code had to be developed and deployed to allow for the extraction and integration of the third party data. This process is time consuming and costly. AWS Data Exchange saves greatly on these costs, as the service provides pre-built data pipelines connecting data subscribers (consumers of data) and providers (producers of data). The service acts as a place where data subscribers can gain easy access to third party data. Meanwhile, for data providers it acts as a marketplace where they can sell their data on. This is enabled through the AWS Data Exchange API. Project professionals and project data analysts will have access to data from all across the world and across industries. The marketplace offers useful data sets ranging from data on COVID-19 to data from the public sector. These new data sources can help expand the scope of project data analytics and thereby allow management to make more informed decisions. How does this all work? Once sourced, the data can be stored in Amazon Simple Storage Service (Amazon S3), a scalable storage service, and then further analysed with AWS analytics and machine learning services to provide important insights. So when looking to advance your analytics, think about third party data sets and how they can enrich your data.
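A rough sketch of what browsing entitled data sets looks like from Python with boto3; the API calls are commented out so this stays runnable offline, and the response handling assumes the shape returned by the service's list operation:

```python
# Sketch: discovering third party data sets via AWS Data Exchange.
# import boto3
# dx = boto3.client("dataexchange")
# page = dx.list_data_sets(Origin="ENTITLED")  # data sets you subscribe to

def matching_data_sets(page, keyword):
    """Filter one page of a list_data_sets response by a keyword in the
    data set name (e.g. to find COVID-19 related sets)."""
    return [
        ds["Name"]
        for ds in page.get("DataSets", [])
        if keyword.lower() in ds["Name"].lower()
    ]

# From here, the chosen data set's assets would be exported to an
# Amazon S3 bucket and analysed with AWS analytics/ML services.
```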