Showing results for tags 'hackathons'.

Found 7 results

  1. We’re excited to announce the winners of the Databricks Generative AI Hackathon. The hackathon drew hundreds of data and AI practitioners from 60 invited companies... View the full article
  2. The PartyRock Generative AI Hackathon wrapped up earlier this month. Entrants were asked to use PartyRock to build a functional app based on one of four challenge categories, with the option to remix an existing app as well. The hackathon attracted 7,650 registrants who submitted over 1,200 projects and published over 250 project blog posts on community.aws. As a member of the judging panel, I was blown away by the creativity and sophistication of the entries I was asked to review. Participants had the opportunity to go hands-on with prompt engineering, learn about foundation models, and push the bounds of what was possible. Let’s take a quick look at the winners of the top 3 prizes…

First Place
First up, taking home the top overall prize of $20,000 in AWS credits, is Parable Rhythm – The Interactive Crime Thriller by Param Birje. This project immerses you in a captivating interactive story using PartyRock’s generative capabilities. Just incredible stuff. To learn more, read the hackathon submission and the blog post.

Second Place
In second place, earning $10,000 in credits, is Faith – Manga Creation Tools by Michael Oswell. This creative assistant app lets you generate original manga panels with the click of a button. So much potential there. To learn more, read the hackathon submission.

Third Place
Rounding out the top 3 overall is Arghhhh! Zombie by Michael Eziamaka. This is a wildly entertaining generative AI-powered zombie game that had the judges on the edge of their seats. Great work, Michael! To learn more, read the hackathon submission.

Round of Applause
I want to give a huge round of applause to all our category winners as well:

Category / Place | Submission | Prize (USD) | AWS Credits
Overall 1st Place | Parable Rhythm | – | $20,000
Overall 2nd Place | Faith – Manga Creation Tools | – | $10,000
Overall 3rd Place | Arghhhh! Zombie | – | $5,000
Creative Assistants 1st Place | Faith – Manga Creation Tools | $4,000 | $1,000
Creative Assistants 2nd Place | MovieCreator | $1,500 | $1,000
Creative Assistants 3rd Place | WingPal | $500 | $1,000
Experimental Entertainment 1st Place | Parable Rhythm | $4,000 | $1,000
Experimental Entertainment 2nd Place | Arghhhh! Zombie | $1,500 | $1,000
Experimental Entertainment 3rd Place | Find your inner potato | $500 | $1,000
Interactive Learning 1st Place | DeBeat Coach | $4,000 | $1,000
Interactive Learning 2nd Place | Asteroid Mining Assistant | $1,500 | $1,000
Interactive Learning 3rd Place | Unlock your pet’s language | $500 | $1,000
Freestyle 1st Place | MindMap Party | $1,000 | $1,000
Freestyle 2nd Place | Angler Advisor | $750 | $1,000
Freestyle 3rd Place | SafeScares | $250 | $1,000
BONUS: Remix ChatRPG | Inferno | – | $2,500
BONUS: Remix ChatRPG | Chat RPG Generator | – | $2,500

From interactive learning experiences to experimental entertainment, the creativity and technical execution on display were off the charts. And of course, a big thank you to all 7,650 participants who dove in and pushed the boundaries of what’s possible with generative AI. You all should be extremely proud.

Join the Party
You can click on any of the images above and try out the apps for yourself. You can remix and customize them, and you can build your own apps as well (read my post, Build AI apps with PartyRock and Amazon Bedrock, to see how to get started). Alright, that’s a wrap. Congrats again to our winners, and a huge thanks to the PartyRock team and all our amazing sponsors. I can’t wait to see what you all build next. Until then, keep building, keep learning, and keep having fun! — Jeff; View the full article
  3. Today the world celebrates International Women’s Day, which is dedicated to honoring the achievements of women worldwide and advocating for gender equality. This is the perfect opportunity to share a story that truly embodies the spirit of the occasion. This blog shares how, in the traditionally male-dominated tech industry, a trio of women embarked on a remarkable journey at the recent Tech to the Rescue hackathon. We hope it inspires women in tech to get involved in initiatives that let them lend their time and talents to causes where they can make a positive impact on their community and the world – like those offered through Tech to the Rescue.

About the hackathon
Tech to the Rescue helps tech companies make an impact by matching them with ambitious nonprofits to boost their causes with technology. This past December, Tech to the Rescue collaborated with AWS to run the Air Quality Hackathon, which challenged participants to create cutting-edge technology to combat air pollution, the world’s “silent killer.” The event attracted 240 engineers and innovators, supported by more than 50 tech mentors, from 27 countries across five continents. Together, they built 33 solutions for seven air pollution challenges, with $30,000 dedicated to support and donations.

Our motivation
I’m part of the Psychometrics and Data Science group within AWS Certification, and my job is to create valid and reliable AWS Certification exams for thousands of AWS Certification test-takers. In mid-November 2023, I learned about the Air Quality Hackathon and felt a strong desire to participate. Soon after, I registered for the event as captain of the ‘Psychometrician’ team with two colleagues, Vinita Talreja and Ye (Cheryl) Ma. None of us had ever participated in a hackathon before, but we were excited to give it a try! December is the busiest time of year for our team, and I vividly remember asking my manager, Anjali Weber, for permission to devote two-and-a-half days to this event. She expressed concerns about our end-of-year deliverables and workload. Moreover, psychometrics is a distinct field with minimal overlap with the typical work of developers, data engineers, and data scientists at tech companies, who are more accustomed to the kinds of challenges this event presented, so our chances of delivering a successful solution seemed slim. But inspired by the AWS leadership principle of ‘learn and be curious’, I assured my manager, “We are not going to win, but this is a great opportunity for the team to think outside the box, engage with a larger community, and learn things that could potentially impact a broader audience.” She responded, “You are all superwomen. You have my full support!”

The challenge
We selected the seventh challenge, presented by the Centre for Research on Energy and Clean Air (CREA) in Poland. Our task was to build a supervised machine learning (ML) model that could accurately predict nitrogen oxides (NOx) emissions (provided by CREA as the ground-truth values) from 11 power plants in Taiwan. The possible predictor values are embedded in 14,877 satellite images collected between March 1, 2019, and September 30, 2023. Each image spans 39,042 longitude and latitude coordinates, 11 of which correspond to the power plant locations CREA is monitoring. In total, they amount to 65,395,350 data points.
Modeling such a vast dataset presented considerable challenges, further compounded by the fact that, for any given plant on any day within this four-year window, predictor values might be missing from the satellite data due to weather conditions. This introduced an additional challenge to data mining, the crucial and most time-consuming step in developing an ML model.

The technical journey
Facing this daunting data challenge, we laid out our analysis strategy. We first calculated the distance between each power plant and the grid coordinates using the Haversine formula (reproduced below for reference), and then used those distances to retain only the data within a 50-mile radius of the plants. This approach not only dramatically reduced the size of the modeling dataset from 65,395,350 to 106,840 points, but also let us creatively and reasonably impute the missing predictor values for any day at any of the 11 power plants.
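For reference, the standard Haversine great-circle distance between a plant at latitude/longitude (φ1, λ1) and a grid point at (φ2, λ2), on a sphere of radius R (roughly 3,959 miles for Earth), is

$$
a = \sin^2\!\left(\tfrac{\Delta\varphi}{2}\right) + \cos\varphi_1\,\cos\varphi_2\,\sin^2\!\left(\tfrac{\Delta\lambda}{2}\right),\qquad
c = 2\,\operatorname{atan2}\!\left(\sqrt{a},\,\sqrt{1-a}\right),\qquad
d = R\,c,
$$

with Δφ = φ2 − φ1 and Δλ = λ2 − λ1 in radians. Keeping only the grid points with d ≤ 50 miles around each plant is the filtering step described above.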
Data scaling and transformation is another critical step in building a successful ML model. We thoroughly examined the distributions of the candidate predictors, evaluated their relationships with the NOx emission values we needed to predict, and selected, for each predictor, the transformation technique that yielded a distribution closest to normal. Finally, instead of training a single ML model, we opted for an AutoML approach (using a Python package developed by Amazonians) with Amazon SageMaker to determine the relationships between NOx emission values and the predictor variables. With all these steps, our AutoML model achieved an R-squared value of 0.9079, a highly promising result. Rigorously trained as researchers, and even under an extremely tight timeline, we managed to document every step of the project and record a 20-minute demo to help the hackathon judges and mentors understand the rationale behind the process and evaluate the outcome. After more than 36 hours of racing against the clock, we submitted our results and waited with a quiet sense of hope.

Beyond the competition
Around 7 a.m. EST on the last day of the hackathon, we received notification that our solution had excelled in the competition: we won first place in one of the seven challenges! After a 30-minute Q&A session with the judges and mentors that morning, this victory advanced us to the final round, where we were selected among the top three finalists and awarded $1,000. Although we did not secure the top prize in the end, we were told it was a tough call, and the judges were impressed with our prototype’s ability to accurately estimate NOx pollution from power plants, highlighting its promise and potential impact.

A creative win
While still excited about the outcome, we received another happy surprise: our team also triumphed in the team photo competition. During the 36-hour intensive coding and documenting session, we squeezed in the time to use an AI tool to create a photo depicting us as three happy lady coders with our lovely pets. This win earned us another $1,000 prize, which we donated entirely to our task sponsor, CREA. As victors, we were also allowed to adopt an animal through WWF UK, and we all picked penguins. Winning this competition showcases a unique angle that women in tech can bring to the community: blending creativity with compassion and deep empathy for the world around us.

Inspiring women in tech
Our journey through the Air Quality Hackathon serves as a powerful testament to the impact that women in tech can have, not only within the industry but also in addressing societal and environmental challenges. We went into the event excited but unsure how far our skills might take us, yet we were willing to be good students and stay open and curious throughout the process. Our diversity, collaboration, and deep analytical thinking led us to success and a deep sense of pride in the outcome of our hard work. We hope our experience serves as a beacon, inspiring women in technology to contribute their skills to causes that matter. Your talent and ideas make a difference! As we continue to break barriers and support one another, we hope you’ll join us in using technology for the greater good. Together, we have the power to change the world! View the full article
  4. With all the generative AI announcements at AWS re:Invent 2023, I’ve committed to diving deep into this technology and learning as much as I can. If you have too, I’m happy to report that, among other resources, the AWS community also has a space for generative AI tools and guides.

Last week’s launches
Here are some launches that got my attention during the previous week.

Amazon Q data integration in AWS Glue (Preview) – Now you can use natural language to ask Amazon Q to author jobs, troubleshoot issues, and answer questions about AWS Glue and data integration. Amazon Q was launched in preview at AWS re:Invent 2023 and is a generative AI–powered assistant that helps you solve problems, generate content, and take action.

General availability of CDK Migrate – CDK Migrate is a component of the AWS Cloud Development Kit (CDK) that enables you to migrate AWS CloudFormation templates, previously deployed CloudFormation stacks, or resources created outside of Infrastructure as Code (IaC) into a CDK application. This feature was launched alongside the CloudFormation IaC Generator to give you an end-to-end experience for creating an IaC configuration based on a resource and its relationships. You can expect the IaC Generator to have a huge impact on a common use case we’ve seen.

For a full list of AWS announcements, be sure to keep an eye on the What’s New at AWS page.

Other AWS news
Here are some additional projects, programs, and news items that you might find interesting:

Amazon API Gateway processed over 100 trillion API requests in 2023, demonstrating the growing demand for API-driven applications. API Gateway is a fully managed API management service. Customers from all industry verticals told us they’re adopting API Gateway for two main reasons: first, its ability to scale to meet the demands of even the most high-traffic applications; second, its fully managed, serverless architecture, which eliminates the need to manage any infrastructure and frees customers to focus on their core business needs.

Join the PartyRock Generative AI Hackathon by AWS. This is a challenge for you to get hands-on building generative AI-powered apps. You’ll use Amazon PartyRock, an Amazon Bedrock playground, as a fast and fun way to learn about prompt engineering and Foundation Models (FMs) while building a functional app with generative AI.

AWS open source news and updates – My colleague Ricardo writes this weekly open source newsletter, in which he highlights new open source projects, tools, and demos from the AWS Community.

Upcoming AWS events
Whether you’re in the Americas, Asia Pacific & Japan, or EMEA region, there’s an upcoming AWS Innovate Online event that fits your time zone. Innovate Online events are free, online, and designed to inspire and educate you about AWS.

AWS Summits are a series of free online and in-person events that bring the cloud computing community together to connect, collaborate, and learn about AWS. These events are designed to educate you about AWS products and services and help you develop the skills needed to build, deploy, and operate your infrastructure and applications. Find an AWS Summit near you and register, or set a notification to know when registration opens for a Summit that interests you.

AWS Community re:Invent re:Caps – Join a Community re:Cap event organized by volunteers from AWS User Groups and AWS Cloud Clubs around the world to learn about the latest announcements from AWS re:Invent.

You can browse all upcoming in-person and virtual events.

That’s all for this week. Check back next Monday for another Weekly Roundup!

– Veliswa

This post is part of our Weekly Roundup series. Check back each week for a quick roundup of interesting news and announcements from AWS! View the full article
  5. With the return of DockerCon, held October 4-5 in Los Angeles, we’re excited to announce the kick-off of a Docker AI/ML Hackathon. Join us at DockerCon — in-person or virtually — to learn about the latest Docker product announcements. Then, bring your innovative artificial intelligence (AI) and machine learning (ML) solutions to life in the hackathon for a chance to win cool prizes. The Docker AI/ML Hackathon is open from October 3 – November 7, 2023. DockerCon in-person attendees are invited to the dedicated hackspace, where you can chat with fellow developers, Dockhands, and our partners Datastax, Navan.ai, Neo4J, OctoML, and Ollama. We’ll also host virtual webinars, Q&A, and engaging chats throughout the next five weeks to keep the ideas flowing. Register for the Docker AI/ML Hackathon to participate and to be notified of event activities. View the full article
  6. See how Applitools Ultrafast Test Cloud can help online retailers shape up their apps ahead of the busy holiday shopping season. SAN MATEO, Calif., November 10, 2020 — Applitools, provider of a next-generation test automation platform powered by Visual AI and Ultrafast Test Cloud, today announced the Applitools Holiday Shopping Hackathon. The holiday-themed contest provides […] The post Applitools Announces Online Shopping Holiday Hackathon appeared first on DevOps.com. View the full article
  7. Many organizations host ideation events to innovate and prototype new ideas faster. These events usually run for a short duration and involve collaboration between members of participating teams. By the end of the event, a successful demonstration of a working prototype is expected, and the winner or the next steps are determined. It is therefore important to build a working proof of concept quickly, and to do that teams need to be able to share code and have it peer reviewed in real time. In this post, you see how AWS Cloud9 can help teams collaborate, pair program, and track each other’s inputs in real time for a successful hackathon experience.

AWS Cloud9 is a cloud-based integrated development environment (IDE) that lets you write, run, and debug code from any machine with just a browser. A shared environment is an AWS Cloud9 development environment that multiple users have been invited to participate in, where they can edit or view its shared resources. Pair programming and mob programming are development approaches in which two or more developers collaborate simultaneously to design, code, or test solutions. At their core is the premise that two or more people work on the same code at the same time, which allows for real-time code review and can result in higher-quality software.

Hackathons are one of the best ways to collaboratively solve problems, often with code. Cross-functional two-pizza teams compete with limited resources under time constraints to solve a challenging business problem. Several companies have adopted hackathons to foster a culture of innovation, providing a platform for developers to showcase their creativity and acquire new skills. Teams are either provided a roster of ideas to choose from or come up with their own.

Solution overview
In this post, you create an AWS Cloud9 environment shared with three AWS Identity and Access Management (IAM) users (the hackathon team). You also see how this team can code together to develop a sample serverless application using an AWS Serverless Application Model (AWS SAM) template. The following diagram illustrates the deployment architecture.

Figure 1: Solution overview

Prerequisites
To complete the steps in this post, you need an AWS account with administrator privileges.

Set up the environment
To start setting up your environment, complete the following steps:

1. Create an AWS Cloud9 environment in your AWS account.
2. Create and attach an instance profile to AWS Cloud9 to call AWS services from the environment. For more information, see Create and store permanent access credentials in an environment.
3. On the AWS Cloud9 console, select the environment you just created and choose View details.

Figure 2: Cloud9 View details

4. Note the environment ID from the Environment ARN value; we use this ID in a later step.

Figure 3: Environment ARN

5. In your AWS Cloud9 terminal, create the file usersetup.sh with the following contents:

#USAGE:
#STEP 1: Execute following command within Cloud9 terminal to retrieve environment id
# aws cloud9 list-environments
#STEP 2: Execute following command by providing appropriate parameters: -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3
# sh usersetup.sh -e 877f86c3bb80418aabc9956580436e9a -u User1,User2
function usage() {
  echo "USAGE: sh usersetup.sh -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3"
}
while getopts ":e:u:" opt; do
  case $opt in
    e)
      if ! aws cloud9 describe-environment-status --environment-id "$OPTARG" 2>&1 >/dev/null; then
        echo "Please provide valid cloud9 environmentid."
        usage
        exit 1
      fi
      environmentId="$OPTARG"
      ;;
    u)
      if [ "$OPTARG" == "" ]; then
        echo "Please provide comma separated list of usernames."
        usage
        exit 1
      fi
      users="$OPTARG"
      ;;
    \?)
      echo "Incorrect arguments."
      usage
      exit 1;;
  esac
done
if [ "$OPTIND" -lt 5 ]; then
  echo "Missing required arguments."
  usage
  exit 1
fi
IFS=',' read -ra userNames <<< "$users"
groupName='HackathonUsers'
groupPolicy='arn:aws:iam::aws:policy/AdministratorAccess'
userArns=()
function createUsers() {
  userList=""
  if aws iam get-group --group-name $groupName > /dev/null 2>&1; then
    echo "$groupName group already exists."
  else
    if aws iam create-group --group-name $groupName 2>&1 >/dev/null; then
      echo "Created user group - $groupName."
    else
      echo "Error creating user group - $groupName."
      exit 1
    fi
  fi
  if aws iam attach-group-policy --policy-arn $groupPolicy --group-name $groupName; then
    echo "Attached group policy."
  else
    echo "Error attaching group policy to - $groupName."
    exit 1
  fi
  for userName in "${userNames[@]}" ; do
    randomPwd=`aws secretsmanager get-random-password \
      --require-each-included-type \
      --password-length 20 \
      --no-include-space \
      --output text`
    userList="$userList"$'\n'"Username: $userName, Password: $randomPwd"
    userArn=`aws iam create-user \
      --user-name $userName \
      --query 'User.Arn' | sed -e 's/\/.*\///g' | tr -d '"'`
    userArns+=( $userArn )
    aws iam wait user-exists \
      --user-name $userName
    echo "Successfully created user $userName."
    aws iam create-login-profile \
      --user-name $userName \
      --password $randomPwd \
      --password-reset-required 2>&1 >/dev/null
    aws iam add-user-to-group \
      --user-name $userName \
      --group-name $groupName
  done
  echo "Waiting for users profile setup..."
  sleep 8
  for arn in "${userArns[@]}" ; do
    aws cloud9 create-environment-membership \
      --environment-id $environmentId \
      --user-arn $arn \
      --permissions read-write 2>&1 >/dev/null
  done
  echo "Following users have been created and added to $groupName group."
  echo "$userList"
}
createUsers

6. Run the script, replacing the following parameters:
- ENVIRONMENTID – the environment ID you saved earlier
- USERNAME1,USERNAME2,… – a comma-separated list of users (in this example, we use three users)

sh usersetup.sh -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3

The script creates the following resources:
- The IAM users that you defined
- The IAM user group HackathonUsers, with the newly created users assigned administrator access

The users are assigned a random password, which must be changed before their first login. User passwords can be shared with your team from the AWS Cloud9 terminal output. Instruct your team to sign in to the AWS Cloud9 console and open the shared environment by choosing Shared with you.

Figure 4: Shared environments

7. Run the create-repository command, specifying a unique name, an optional description, and optional tags:

aws codecommit create-repository --repository-name hackathon-repo --repository-description "Hackathon repository" --tags Team=hackathon

Note the cloneUrlHttp value from the output; we use it in a later step.

Figure 5: CodeCommit repo URL

The environment is now ready for the hackathon team to start coding. Instruct your team members to open the shared environment from the AWS Cloud9 dashboard. For demo purposes, you can quickly create a sample Python-based Hello World application using the AWS SAM CLI; a sketch of that step is shown below.
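The excerpt does not reproduce the scaffolding command itself, so the following is a minimal sketch of that step. sam init and the flags shown are standard SAM CLI options, but the runtime version and template name here are assumptions; adjust them to whatever your SAM CLI version offers.

# Sketch (assumed runtime/template values): scaffold a Python "Hello World"
# SAM application into a local hackathon-repo directory, matching the
# CodeCommit repository name created above.
sam init \
  --name hackathon-repo \
  --runtime python3.12 \
  --app-template hello-world \
  --package-type Zip

The next step changes into the generated hackathon-repo directory and commits the scaffolded files.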
Run the following commands to commit the files to the local repo:

cd hackathon-repo
git config --global init.defaultBranch main
git init
git add .
git commit -m "Initial commit"

Run the following command to push the local repo to AWS CodeCommit, replacing CLONE_URL_HTTP with the cloneUrlHttp value you noted earlier:

git push CLONE_URL_HTTP --all

For a sample collaboration scenario, watch the video Collaboration with Cloud9.

Clean up
The cleanup script deletes all the resources it created. Make a local copy of any files you want to save. Create a file named cleanup.sh with the following content:

#USAGE:
#STEP 1: Execute following command within Cloud9 terminal to retrieve environment id
# aws cloud9 list-environments
#STEP 2: Execute following command by providing appropriate parameters: -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3
# sh cleanup.sh -e 877f86c3bb80418aabc9956580436e9a -u User1,User2
function usage() {
  echo "USAGE: sh cleanup.sh -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3"
}
while getopts ":e:u:" opt; do
  case $opt in
    e)
      if ! aws cloud9 describe-environment-status --environment-id "$OPTARG" 2>&1 >/dev/null; then
        echo "Please provide valid cloud9 environmentid."
        usage
        exit 1
      fi
      environmentId="$OPTARG"
      ;;
    u)
      if [ "$OPTARG" == "" ]; then
        echo "Please provide comma separated list of usernames."
        usage
        exit 1
      fi
      users="$OPTARG"
      ;;
    \?)
      echo "Incorrect arguments."
      usage
      exit 1;;
  esac
done
if [ "$OPTIND" -lt 5 ]; then
  echo "Missing required arguments."
  usage
  exit 1
fi
IFS=',' read -ra userNames <<< "$users"
groupName='HackathonUsers'
groupPolicy='arn:aws:iam::aws:policy/AdministratorAccess'
function cleanUp() {
  echo "Starting cleanup..."
  groupExists=false
  if aws iam get-group --group-name $groupName > /dev/null 2>&1; then
    groupExists=true
  else
    echo "$groupName does not exist."
  fi
  for userName in "${userNames[@]}" ; do
    if ! aws iam get-user --user-name $userName >/dev/null 2>&1; then
      echo "$userName does not exist."
    else
      userArn=$(aws iam get-user \
        --user-name $userName \
        --query 'User.Arn' | tr -d '"')
      if $groupExists ; then
        aws iam remove-user-from-group \
          --user-name $userName \
          --group-name $groupName
      fi
      aws iam delete-login-profile \
        --user-name $userName
      if aws iam delete-user --user-name $userName ; then
        echo "Successfully deleted $userName"
      fi
      aws cloud9 delete-environment-membership \
        --environment-id $environmentId --user-arn $userArn
    fi
  done
  if $groupExists ; then
    aws iam detach-group-policy \
      --group-name $groupName \
      --policy-arn $groupPolicy
    if aws iam delete-group --group-name $groupName ; then
      echo "Successfully deleted $groupName user group"
    fi
  fi
  echo "Cleanup complete."
}
cleanUp

Run the script by passing the same parameters you passed when setting up the environment:

sh cleanup.sh -e ENVIRONMENTID -u USERNAME1,USERNAME2,USERNAME3

Delete the CodeCommit repository by running the following commands in the root directory, with the appropriate repository name:

aws codecommit delete-repository --repository-name hackathon-repo
rm -rf hackathon-repo

You can delete the AWS Cloud9 environment when the event is over.

Conclusion
In this post, you saw how to use an AWS Cloud9 IDE to collaborate as a team and code together to develop a working prototype. For organizations looking to host hackathon events, these tools can be a powerful way to deliver a rich user experience. For more information about AWS Cloud9 capabilities, see the AWS Cloud9 User Guide.
If you plan on using AWS Cloud9 for ongoing collaboration, refer to the best practices for sharing environments in Working with shared environments in AWS Cloud9.

About the authors
Mahesh Biradar is a Solutions Architect at AWS. He is a DevOps enthusiast and enjoys helping customers implement cost-effective architectures that scale.
Guy Savoie is a Senior Solutions Architect at AWS working with SMB customers, primarily in Florida. In his role as a technical advisor, he focuses on unlocking business value through outcome-based innovation.
Ramesh Chidirala is a Solutions Architect focused on SMB customers in the Central region. He is passionate about helping customers solve challenging technical problems with AWS and achieve their desired business outcomes. View the full article