Showing results for tags 'trends'.

Found 16 results

  1. The landscape of application development is rapidly evolving, propelled by the integration of Artificial Intelligence (AI) into the development process. The Docker AI Trends Report 2024, a precursor to the upcoming State of Application Development Report, highlights notable AI trends among developers. The most recent Docker State of Application Development Survey results offer insights into how developers are adopting and utilizing AI, reflecting a shift toward more intelligent, efficient, and adaptable development methodologies. This transformation is part of a larger trend observed across the tech industry as AI becomes increasingly central to software development.

The annual Docker State of Application Development survey, conducted by our User Research Team, is one way Docker product managers, engineers, and designers gather insights from Docker users to continuously develop and improve the suite of tools the company offers. For example, in Docker’s 2022 State of Application Development Survey, we found that the task for which Docker users most often refer to support/documentation was creating a Dockerfile (reported by 60% of respondents). This finding helped spur the innovation of Docker AI.

More than 1,300 developers participated in the latest Docker State of Application Development survey, conducted in late 2023. The online survey asked respondents about the tools they use, their application development processes and frustrations, feelings about industry trends, Docker usage, and participation in developer communities. We wanted to know where developers are focused, what they’re working on, and what is most important to them. Of the approximately 1,300 respondents, 885 completed the survey; the findings in this report are based on those 885 completed responses.

Who responded to the Docker survey?

Respondents ranged from home hobbyists to professionals at companies with more than 5,000 employees. Forty-two percent of respondents work for a small company (up to 100 employees), 28% work for mid-sized companies (between 100 and 1,000 employees), and 25% work for large companies (more than 1,000 employees). Well over half of the respondents were in engineering roles: 36% identified as back-end or full-stack developers; 21% were DevOps, infrastructure managers, or platform engineers; and 4% were front-end developers. Other respondents included dev/engineering managers, company leadership, product managers, security roles, and AI/ML roles. There was nearly an even split between respondents with more experience (6+ years, 54%) and less experience (0-5 years, 46%).

Our survey underscored a marked growth in roles focused on machine learning (ML) engineering and data science within the Docker ecosystem. In our 2022 survey, approximately 1% of respondents represented this demographic, whereas they made up 8% of the most recent survey. ML engineers and data scientists represent a rapidly expanding user base, which signals the growing relevance of AI to the software development field and the blurring of the lines between tools used by developers and tools used by AI/ML scientists.

More than 34% of respondents said they work in the computing or IT/SaaS industry, but we also saw responses from individuals working in accounting, banking, or finance (8%); business, consultancy, or management (7%); engineering or manufacturing (6%); and education (5%).
Other responses came in from professionals in a wide range of fields, including media; academic research; transport or logistics; retail; marketing, advertising, or PR; charity or volunteer work; healthcare; construction; creative arts or design; and environment or agriculture. Docker users made up 87% of our respondents, whereas 13% reported that they do not use Docker.

AI as an up-and-coming trend

We asked participants what they felt were the most important trends currently in the industry. GenAI (40% of respondents) and AI assistants for software engineering (38%) were the top options identified as important industry trends in software development. More senior developers (back-end, front-end, and full-stack developers with over 5 years of experience) tended to view GenAI as most important, whereas more junior developers (less than 5 years of experience) tended to view AI assistants for software engineering as most important. This difference may signal varied uses of AI at different stages of a software development career.

It’s clearly trendy, but how do developers really feel about AI? A majority view AI as a positive development (65%), say it makes their jobs easier (61%), and say it allows them to focus on more important tasks (55%). A much smaller number of respondents see AI as a threat to their jobs (23%) or say it makes their jobs more difficult (19%). Interestingly, despite high usage and generally positive feelings toward AI, 45% of respondents also reported that they feel AI is over-hyped. Why might this be? It’s not fully clear, but considered alongside the responses on job threat, one possible explanation emerges: respondents may view AI as a critical and useful tool for their work while not being too worried about it replacing them anytime soon.

How AI is used in the developer’s world

We asked users what they use AI for, how dependent they feel on AI, and which AI tools they use most often. A majority of developers (64%) already report using AI for work, underscoring AI’s penetration into the software development field. Developers leverage AI at work mainly for coding (33% of respondents), writing documentation (29%), research (28%), writing tests (23%), troubleshooting/debugging (21%), and CLI commands (20%).

For the 568 respondents who indicated they use AI for work, we also asked how dependent they felt on AI to get their job done, on a scale of 0 (not at all dependent) to 10 (completely dependent). Responses varied substantially by role and years of experience, but the overall average reported dependence was about 4 out of 10, indicating relatively low dependence. In the developer toolkit, respondents indicated that AI tools such as ChatGPT (46% of respondents), GitHub Copilot (30%), and Bard (19%) are the most frequently used.

Conclusion

As the 2024 Docker AI Trends Report shows, artificial intelligence is already shifting the way software development is approached. The insights from more than 800 respondents in our latest survey illuminate a path toward a future where AI is seamlessly integrated into every aspect of application development. From coding and documentation to debugging and writing tests, AI tools are becoming indispensable in enhancing efficiency and problem-solving capabilities, allowing developers to focus on more creative and important work.
The uptake of AI tools such as ChatGPT, GitHub Copilot, and Bard among developers is a testament to AI’s value in the development process. Moreover, the growing interest in machine learning engineering and data science within the Docker community signals a broader acceptance and integration of AI technologies. As Docker continues to innovate and support developers in navigating these changes, the evolving landscape of AI in software development presents both opportunities and challenges. Embracing AI as a positive force that can augment human capabilities rather than replace them is crucial. Docker is committed to facilitating this transition by providing tools and resources that empower developers to leverage AI effectively, ensuring they can remain at the forefront of technological innovation.

Looking ahead, Docker will continue to monitor these trends, adapt our offerings accordingly, and support our user community in harnessing the full potential of AI in software development. As the industry evolves, so too will Docker’s role in shaping the future of application development, ensuring our users are equipped to meet the challenges and seize the opportunities that lie ahead in this exciting era of AI-driven development.

Learn more

  • Introducing a New GenAI Stack: Streamlined AI/ML Integration Made Easy
  • Get started with the GenAI Stack: Langchain + Docker + Neo4j + Ollama
  • Building a Video Analysis and Transcription Chatbot with the GenAI Stack
  • Docker Partners with NVIDIA to Support Building and Running AI/ML Applications
  • Build Multimodal GenAI Apps with OctoAI and Docker
  • How IKEA Retail Standardizes Docker Images for Efficient Machine Learning Model Deployment
  • Case Study: How Docker Accelerates ZEISS Microscopy’s AI Journey
  • Docker AI: From Prototype to CI/CD Pipeline solutions brief
  • Containerize a GenAI app (use case guide)

Docker’s User Research Team — Olga Diachkova, Julia Wilson, and Rebecca Floyd — conducted this survey, analyzed the results, and provided insights. For a complete methodology, contact uxresearch@docker.com. View the full article
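The 2022 finding that writing a Dockerfile is the task that most often sends users to the documentation has a concrete counterpart in code: once a Dockerfile exists, building and running the image is a short loop. The sketch below is a minimal illustration using the docker Python SDK (docker-py) against a local Docker daemon; the image tag, the port mapping, and the assumption of a Dockerfile in the current directory are placeholders for the example, not anything prescribed by the report.

    import docker  # pip install docker

    # Connect to the local Docker daemon using environment defaults (DOCKER_HOST, etc.).
    client = docker.from_env()

    # Build an image from the Dockerfile in the current directory.
    # "survey-demo:latest" is an arbitrary example tag.
    image, build_logs = client.images.build(path=".", tag="survey-demo:latest", rm=True)
    for chunk in build_logs:
        if "stream" in chunk:
            print(chunk["stream"], end="")

    # Run the freshly built image, mapping container port 8000 to host port 8000.
    container = client.containers.run(
        "survey-demo:latest",
        detach=True,
        ports={"8000/tcp": 8000},
    )
    print(f"Started container {container.short_id}")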
  2. Software engineers are expected to face increasing complexity in their work due to new IT... View the full article
  3. Discover what industry experts think the events of Q1 mean for the business cyber security landscape in the UK. View the full article
  4. It is more important than ever to have efficient communication, continuous integration, and fast delivery in the quickly changing field of software development. View the full article
  5. Predicting the future of cybersecurity is an impossible task, but getting some expert advice doesn’t... The post Webinar Recap: Cybersecurity Trends to Watch in 2024 appeared first on Security Boulevard. View the full article
  6. This is part of a larger series on the new infrastructure of the era of AI, highlighting emerging technology and trends in large-scale compute. This month, we’re sharing the 2024 edition of the State of AI Infrastructure report to help businesses harness the power of AI now.

The era of AI is upon us. You’ve heard about the latest advancements in our technology and the new AI solutions powered by Microsoft, our partners, and our customers, and the excitement is just beginning. To continue the pace of these innovations, companies need the best hardware that matches the workloads they are trying to run. This is what we call purpose-built infrastructure for AI: infrastructure that is customized to meet your business needs. Now, let’s explore how Microsoft cloud infrastructure has evolved to support these emerging technologies.

The State of AI Infrastructure: an annual report on trends and developments in AI infrastructure, based on Microsoft-commissioned surveys conducted by Forrester Consulting and Ipsos. Read the report.

Looking back at Microsoft’s biggest investments in AI infrastructure

2023 brought huge advancements in AI infrastructure. From new virtual machines to updated services, we’ve paved the way for AI advancements that include custom-built silicon and powerful supercomputers. Highlights of Microsoft AI infrastructure innovations in 2023 include:

  • Launching new Azure Virtual Machines powered by AMD Instinct and NVIDIA Hopper graphics processing units (GPUs), optimized for different AI and high-performance computing (HPC) workloads, such as large language models, mid-range AI training, and generative AI inferencing.
  • Introducing Azure confidential VMs with NVIDIA H100 GPUs, enabling secure and private AI applications on the cloud.
  • Developing custom-built silicon for AI and enterprise workloads, such as the Azure Maia AI accelerator series, an AI accelerator chip, and the Azure Cobalt CPU series, a cloud-native chip based on Arm architecture.
  • Building the third most powerful supercomputer in the world, Azure Eagle, with 14,400 NVIDIA H100 GPUs and Intel Xeon Sapphire Rapids processors, and achieving the second-best MLPerf Training v3.1 record submission using 10,752 H100 GPUs.

Understanding the state of AI and demand for new infrastructure

2024 is shaping up to be an even more promising year for AI than its predecessor. With the rapid pace of technological advancements, AI infrastructure is becoming more diverse and widespread than ever before. From cloud to edge, and from CPUs and GPUs to application-specific integrated circuits (ASICs), the AI hardware and software landscape is expanding at an impressive rate. To help you keep up with the current state of AI, its trends and challenges, and to learn about best practices for building and deploying scalable and efficient AI systems, we’ve recently published our Microsoft Azure: The State of AI Infrastructure report. The report addresses the following key themes.

Using AI for organizational and personal advancement

AI is revolutionizing the way businesses operate, with an overwhelming 95% of organizations planning to expand their usage in the next two years. Recent research commissioned by Microsoft highlights the role of AI in driving innovation and competition. Beyond mandates, individuals within these organizations recognize the value AI brings to their roles and the success of their companies. IT professionals are at the forefront of AI adoption and use, with 68% of those surveyed already implementing it in their professional work. But it doesn’t stop there: AI is also being used in their personal lives, with 66% of those surveyed incorporating it into their daily routines. AI’s transformative potential spans industries, from improving diagnostic accuracy in healthcare to optimizing customer service through intelligent chatbots. As AI shapes the future of work, it’s essential for organizations to embrace its adoption to stay competitive in an ever-evolving business landscape.

Navigating from AI exploration to implementation

The implementation of AI in businesses is still in its early stages, with one-third of companies exploring and planning their approach. However, a significant segment has progressed to pilot testing, experimenting with AI’s capabilities in real-world scenarios and taking the next critical step toward full-scale implementation. This phase is crucial, as it allows businesses to gauge the effectiveness of AI, tailor it to their specific needs, and identify any potential issues before a wider rollout. Because of this disparity in adoption, organizations have a unique opportunity to differentiate themselves and gain a competitive advantage by accelerating their AI initiatives. However, many organizations will need to make significant technology and infrastructure changes before they can fully leverage AI’s benefits. Those who can quickly navigate from exploration to implementation will establish themselves as leaders in leveraging AI for innovation, efficiency, and enhanced decision-making.

Acknowledging challenges of building and maintaining AI infrastructure

To fully leverage AI’s potential, companies need to ensure they have a solid foundation to support their AI strategies and drive innovation. As in the transportation industry, a solid infrastructure that can manage everyday congestion is crucial. However, AI infrastructure skilling remains the largest challenge, both within companies and in the job market. This challenge is multifaceted, encompassing the complexity of orchestrating AI workloads, a shortage of skilled personnel to manage AI systems, and the rapid pace at which AI technology evolves. These hurdles can impede an organization’s ability to fully leverage AI’s potential, leading to inefficiencies and missed opportunities.

Leveraging partners to accelerate AI innovation

Strategic partnerships play a pivotal role in organizations’ AI journeys. As companies delve deeper into AI, they often seek out solution providers with deep AI expertise and a track record of proven AI solutions. These partnerships are instrumental in accelerating AI production and addressing the complex challenges of AI infrastructure. Partners are expected to assist with a range of needs, including infrastructure design, training, security, compliance, and strategic planning. As businesses progress in their AI implementation, their priorities shift toward performance, optimization, and cloud provider integration. Engaging the right partner can significantly expedite the AI journey for businesses of any size and at any stage of AI implementation. This presents a substantial opportunity for partners to contribute, but it also places a responsibility on them to ensure their staff is adequately prepared to provide consulting, strategy, and training services.

Discover more

To drive major AI innovation, companies must overcome many challenges at a breakneck pace. Our insights in The State of AI Infrastructure report underscore the need for a strategic approach to building and maintaining AI infrastructure that is agile, scalable, and capable of adapting to the latest technological advancements. By addressing these infrastructure challenges, companies can ensure they have a solid foundation to support their AI strategies and drive innovation. Download The State of AI Infrastructure report.

References: Annual Roundup of AI Infrastructure Breakthroughs for 2023 #AIInfraMarketPulse

The post New infrastructure for the era of AI: Emerging technology and trends in 2024 appeared first on Microsoft Azure Blog. View the full article
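Purpose-built infrastructure in practice often starts with picking a VM family sized for the workload. As a hedged illustration, the Python sketch below lists the GPU-oriented N-series sizes available in one Azure region using the azure-identity and azure-mgmt-compute packages; the region, the environment-variable name for the subscription ID, and the name-prefix filter are assumptions made for this example, not part of the report.

    import os

    from azure.identity import DefaultAzureCredential        # pip install azure-identity
    from azure.mgmt.compute import ComputeManagementClient   # pip install azure-mgmt-compute

    # Assumes AZURE_SUBSCRIPTION_ID is set and the environment can authenticate
    # (CLI login, managed identity, or environment credentials).
    subscription_id = os.environ["AZURE_SUBSCRIPTION_ID"]
    client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

    # List VM sizes in a region and keep the NC/ND names, which Azure uses for
    # GPU-accelerated families; the prefix match is a rough illustrative filter.
    for size in client.virtual_machine_sizes.list(location="eastus"):
        if size.name.startswith(("Standard_NC", "Standard_ND")):
            print(f"{size.name}: {size.number_of_cores} vCPUs, {size.memory_in_mb} MB RAM")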
  7. Written by: Maddie Stone, Jared Semrau, James Sadowski

Combined data from Google’s Threat Analysis Group (TAG) and Mandiant shows that 97 zero-day vulnerabilities were exploited in 2023, a significant increase over the 62 zero-day vulnerabilities identified in 2022, though still fewer than 2021’s peak of 106 zero-days. This finding comes from the first-ever joint zero-day report by TAG and Mandiant. The report highlights 2023 zero-day trends, with a focus on two main categories of vulnerabilities. The first is end-user platforms and products such as mobile devices, operating systems, browsers, and other applications. The second is enterprise-focused technologies such as security software and appliances.

Key zero-day findings from the report include:

  • Vendors’ security investments are working, making certain attacks harder.
  • Attacks increasingly target third-party components, affecting multiple products.
  • Enterprise targeting is rising, with more focus on security software and appliances.
  • Commercial surveillance vendors lead browser and mobile device exploits.
  • The People’s Republic of China (PRC) remains the top state-backed exploiter of zero-days.
  • Financially motivated attacks proportionally decreased.

Threat actors are increasingly leveraging zero-days, often for the purposes of evasion and persistence, and we don’t expect this activity to decrease anytime soon. Progress is being made on all fronts, but zero-day vulnerabilities remain a major threat.

A Look Back — 2023 Zero-Day Activity at a Glance

Barracuda ESG: CVE-2023-2868

Barracuda disclosed in May 2023 that a zero-day vulnerability (CVE-2023-2868) in their Email Security Gateway (ESG) had been actively exploited since as early as October 2022. Mandiant investigated and determined that UNC4841, a suspected Chinese cyber espionage actor, was conducting attacks across multiple regions and sectors as part of an espionage campaign in support of the PRC. Mandiant released a blog post with findings from the initial investigation, a follow-up post with more details as the investigation continued, and a hardening guide. Barracuda also released a detailed advisory with recommendations.

VMware ESXi: CVE-2023-20867

Mandiant discovered that UNC3886, a Chinese cyber espionage group, had been exploiting a VMware zero-day vulnerability (CVE-2023-20867) in a continued effort to evade security solutions and remain undiscovered. The investigation shed considerable light on UNC3886’s deep understanding and technical knowledge of ESXi, vCenter, and VMware’s virtualization platform. Mandiant released a blog post detailing UNC3886 activity involving exploitation of this zero-day vulnerability, along with detection, containment, and hardening opportunities to better defend against the threat. VMware also released an advisory with recommendations.

MOVEit Transfer: CVE-2023-34362

Mandiant observed a critical zero-day vulnerability in Progress Software’s MOVEit Transfer file transfer software (CVE-2023-34362) being actively exploited for data theft since as early as May 27, 2023. Mandiant initially attributed the activity to UNC4857, which was later merged into FIN11 based on targeting, infrastructure, certificate, and data leak site overlaps. Mandiant released a blog post with details on the activity, as well as a containment and hardening guide to help protect against the threat. Progress released an advisory with details and recommendations.

Takeaways

Zero-day exploitation has the potential to be high impact and widespread, as evidenced by the three examples shared in this post. Vendors must continue investing in security to reduce risk for their users and customers, and organizations across all industry verticals must remain vigilant. Zero-day attacks that get through defenses can result in significant financial losses, reputational damage, data theft, and more. While zero-day threats are difficult to defend against, a defense-in-depth approach to security can help reduce potential impact. Organizations should focus on sound security principles such as vulnerability management, network segmentation, least privilege, and attack surface reduction. Additionally, defenders should conduct proactive threat hunting and follow guidance and recommendations provided by security organizations. Read the report now to learn more about the zero-day landscape in 2023. View the full article
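Vulnerability management, one of the basics the takeaways recommend, usually begins with tracking advisories for specific CVEs. The Python sketch below is a minimal illustration that queries the public NVD CVE API (version 2.0) for the MOVEit Transfer CVE discussed above using the requests library; the shallow JSON parsing follows NVD’s documented response shape but should be treated as an assumption to verify, and production use would add an API key and rate limiting.

    import requests  # pip install requests

    NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

    def fetch_cve(cve_id: str) -> dict:
        """Fetch a single CVE record from the public NVD API (rate-limited without an API key)."""
        resp = requests.get(NVD_URL, params={"cveId": cve_id}, timeout=30)
        resp.raise_for_status()
        return resp.json()

    data = fetch_cve("CVE-2023-34362")  # MOVEit Transfer zero-day discussed above
    for item in data.get("vulnerabilities", []):
        cve = item["cve"]
        # Keep only the English description for a quick summary.
        description = next(
            (d["value"] for d in cve.get("descriptions", []) if d.get("lang") == "en"),
            "(no description)",
        )
        print(cve["id"], "published", cve.get("published"))
        print(description)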
  8. A company’s data strategy is always in motion. Since the explosion of interest in generative AI and large language models (LLMs), that is more true than ever, with business leaders discussing how quickly they should adopt these technologies to stay competitive. Some emerging approaches can be seen in our newly released Snowflake Data Trends 2024, which looks at how users in the Data Cloud are working with their data. Understanding which features, languages, and approaches our users embrace provides real indications of how organizations are preparing their data for this fast-moving new age of advanced AI.

For the report, we focused on two broad aspects of data strategy. We looked at activity around LLMs and the applications that work with them. But before a company can be successful with generative AI, LLMs, and other innovative technologies, it has to build a strong data foundation. So we first looked at foundational activities that could suggest whether organizations are shifting or accelerating the work of preparing for advanced new technologies. We saw that, collectively, organizations are definitely preparing their data to be used more effectively with powerful new AI technologies.

The most marked finding was around governance. Strong data governance is essential to meet security and compliance obligations, but it is often regarded as a hindrance. If IT “locks down” the data, it can’t be used to derive insight and refine strategy — or so the complaint often goes. Looking at activity in the Data Cloud, we found quite the opposite. Application of individual governance features (tags applied to data objects, masking policies assigned to data sets, etc.) rose between 72% and 98% from January 2023 to January 2024. The cumulative number of queries run against this policy-protected data rose as well, by 142%. This suggests that even as organizations increase the granularity of their data governance practices, they’re able to do more, not less, with the data. The report also covers increased usage of the AI-friendly programming language Python, which grew by 571%. We also saw a lot more work with unstructured data, which has great AI potential, since estimates consistently put the share of all data that’s unstructured at 80% to 90%.

Looking at AI work and applications, our most exciting activity came from our Streamlit developer community. From April 2023 to January 2024, more than 20,000 developers worked on 33,000+ LLM applications, including apps in development. Across that period, the percentage of such apps that were chatbots increased notably, from 18% in April to 46% by the end of January. We would expect much of this to have been experimentation and pilot projects, but the fact that so many devs are eager to work with these complex AI models only confirms the expectation that a transformative wave of innovation is beginning.

A further insight into the potential shape of the future: the fast-growing adoption of the Snowflake Native App Framework (generally available on AWS and Azure, private preview on GCP) since it entered public preview last summer tells us that our mantra of bringing computation to a single, secure set of data — rather than farming out copies of your data to various environments — resonates with our users. When we look at how the features of the Data Cloud are actually being used, we not only refine our own product plans, we also see the trends and the future of enterprise data. We continue to innovate at Snowflake to shape that future. I hope that this report gives business and IT leaders ideas and indicators that will help them shape their data strategies within the Data Cloud and beyond. For more, read Snowflake Data Trends 2024.

The post Data Trends 2024: Strategies for an AI-Ready Data Foundation appeared first on Snowflake. View the full article
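A chatbot-style Streamlit app of the kind counted in the report can be surprisingly small. The sketch below is a generic pattern rather than Snowflake-specific code: it assumes only that streamlit is installed, and call_llm is a placeholder where a real app would call its model endpoint. Saved as app.py, it runs with streamlit run app.py.

    import streamlit as st

    def call_llm(prompt: str) -> str:
        """Placeholder for a real model call (e.g., an LLM served behind an API)."""
        return f"Echo: {prompt}"

    st.title("Minimal LLM chatbot")

    # Streamlit reruns the script on every interaction, so keep history in session state.
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Replay the conversation so far.
    for msg in st.session_state.messages:
        with st.chat_message(msg["role"]):
            st.markdown(msg["content"])

    # st.chat_input renders a chat box and returns the submitted text (or None).
    if prompt := st.chat_input("Ask something"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)

        answer = call_llm(prompt)
        st.session_state.messages.append({"role": "assistant", "content": answer})
        with st.chat_message("assistant"):
            st.markdown(answer)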
  9. As the year rolls on, here are a few key DevOps trends to watch for in 2024. View the full article
  10. This article identifies some basic trends in the software industry. Specifically, we will explore how some well-known organizations implement and benefit from early and continuous testing, faster software delivery, reduced costs, and increased collaboration. While it is clear that activities like breaking down silos, shift-left testing, automation, and continuous delivery are interrelated, it is beneficial to take a look at how companies strive to achieve such goals in practice. Companies try to break down the traditional silos that separate development, operations, and testing teams. This eliminates barriers and fosters collaboration, where all teams share responsibility for quality throughout the software development lifecycle. This collaborative approach leads to improved problem-solving, faster issue resolution, and ultimately, higher-quality software. View the full article
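Shift-left testing in practice often begins with nothing more elaborate than fast unit tests that run on every commit. The sketch below is a generic pytest example, with an invented apply_discount function standing in for real application code; in a CI pipeline the same pytest -q command would run automatically on each push.

    # test_pricing.py -- run locally or in CI with `pytest -q`
    import pytest

    def apply_discount(price: float, percent: float) -> float:
        """Toy function under test: apply a percentage discount, rejecting bad input."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount_happy_path():
        assert apply_discount(100.0, 20) == 80.0

    def test_apply_discount_rejects_bad_percent():
        with pytest.raises(ValueError):
            apply_discount(100.0, 150)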
  11. The fintech landscape is ever-evolving, driven by tech advancements, regulatory changes, and shifts in consumer behavior. View the full article
  12. From artificial intelligence to 5G, 2024 could be a pivotal year for the tech industry and workers in the U.K. View the full article
  13. Here are six distinct technology trends that are poised to be particularly influential for DevOps in the new year. View the full article
  14. Online and remote labor growth increases businesses' workforce potential but dilutes data across platforms. View the full article
  15. The rise of Kubernetes, cloud-native, and microservices spawned major changes in architectures and abstractions that developers use to create modern applications. In this multi-part series, I talk with some of the leading experts across various layers of the stack — from networking infrastructure to application infrastructure and middleware to telemetry data and modern observability concerns — to understand emergent platform engineering patterns that are affecting developer workflow around cloud-native. The next participant in our series is Tom Wilkie, CTO at Grafana Labs, where he leads engineering for Grafana Cloud... View the full article
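Much of the telemetry and observability work mentioned here rests on instrumenting application code. The sketch below is a minimal, hedged example using the OpenTelemetry Python SDK: it exports spans to the console rather than to a real backend (such as Grafana Tempo or another OTLP endpoint), and the service and span names are invented for illustration.

    # pip install opentelemetry-sdk
    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

    # Configure a tracer provider that prints spans to stdout; a real deployment
    # would export to a tracing backend instead.
    provider = TracerProvider()
    provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)

    tracer = trace.get_tracer("checkout-service")  # illustrative service name

    def handle_request(order_id: str) -> None:
        # Each span records timing and attributes for one unit of work.
        with tracer.start_as_current_span("handle_request") as span:
            span.set_attribute("order.id", order_id)
            with tracer.start_as_current_span("charge_payment"):
                pass  # payment logic would go here

    handle_request("order-42")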
  16. Here are 5 trends that startups should keep an eye on ... https://www.snowflake.com/blog/five-trends-changing-startup-ecosystem/