Showing results for tags 'cli'.

Found 24 results

  1. List of all commands with options for /opt/lampp/lampp:
     sudo /opt/lampp/lampp start                # Start XAMPP (Apache, MySQL, and any other bundled services)
     sudo /opt/lampp/lampp stop                 # Stop XAMPP
     sudo /opt/lampp/lampp restart              # Restart XAMPP
     sudo /opt/lampp/lampp startapache          # Start only the Apache service
     sudo /opt/lampp/lampp stopapache           # Stop the Apache service
     sudo /opt/lampp/lampp startmysql           # Start only the MySQL/MariaDB service
     sudo /opt/lampp/lampp stopmysql            # Stop the MySQL/MariaDB service
     sudo /opt/lampp/lampp startftp             # Start the ProFTPD service
     sudo /opt/lampp/lampp stopftp              # Stop the ProFTPD service
     sudo /opt/lampp/lampp security             # Run a simple security check script
     sudo /opt/lampp/lampp enablessl            # Enable SSL support for Apache
     sudo /opt/lampp/lampp disablessl           # Disable SSL support for Apache
     sudo /opt/lampp/lampp backup               # Create a simple backup of your XAMPP configuration, data, and logs
     sudo /opt/lampp/lampp status               # Show the status of XAMPP services
     sudo /opt/lampp/lampp reload               # Reload XAMPP (Apache and MySQL reload configuration without stopping)
     sudo /opt/lampp/lampp reloadapache         # Reload only the Apache service
     sudo /opt/lampp/lampp reloadmysql          # Reload only the MySQL/MariaDB service
     sudo /opt/lampp/lampp reloadftp            # Reload the ProFTPD service
     sudo /opt/lampp/lampp enablephpmyadmin     # Enable phpMyAdmin access from the network (modifies permissions)
     sudo /opt/lampp/lampp disablephpmyadmin    # Disable phpMyAdmin access from the network
     sudo /opt/lampp/lampp phpstatus            # Show PHP status (e.g., for checking PHP-FPM)
     sudo /opt/lampp/lampp clean                # Clean XAMPP (clears temporary files and logs)
     The post Lampp commands line reference with example appeared first on DevOpsSchool.com. View the full article
  2. We are pleased to announce the release of HashiCorp Boundary 0.15, which adds session recording storage policies (HCP Plus/Enterprise) and desktop/CLI client improvements like search and filtering. Boundary is a modern privileged access management (PAM) solution designed for dynamic environments: it streamlines end-user access to infrastructure resources without exposing credentials or internal network topologies. Recent initiatives have aimed at improving governance and usability; as a result, previous releases included features like SSH session recording and an embedded terminal in the desktop client. We continue this effort in the 0.15 release and are excited for users to try it out.
Session recording storage policies (HCP Plus/Enterprise)
Introduced in Boundary 0.13, SSH session recording helps organizations meet their compliance objectives by recording detailed end-user activities and commands. Those recordings are stored in the organization's designated Amazon S3 buckets. Boundary 0.15 improves storage governance by allowing administrators to set retention and deletion policies for session recordings. This keeps recordings available and accessible for the desired retention period, so teams can meet various regulatory requirements. It also helps reduce management and storage costs by automatically deleting recordings at the designated time and date.
Improvements to the Boundary Desktop/CLI client
Boundary 0.15 improvements include search and filtering capabilities, session time indicators, and support for ARM64 architectures.
Search and filtering
Recent improvements to the Boundary Desktop client have dramatically simplified the end-user experience. At a large scale, however, some end users may be authorized to connect to tens or hundreds of target resources, which makes it difficult to locate a specific target in a long list. Similarly, finding a specific session among tens or hundreds of active sessions can be challenging. The desktop and CLI client in Boundary 0.15 includes new search and filter capabilities to help users locate their desired targets and sessions. Users simply search for the full or partial names or IDs of the desired target, and can further narrow the results by filtering on scopes or session states (active, pending, or terminated). Larger result sets are paginated for improved search performance. We expect this subtle addition to noticeably improve the user experience and reduce the time it takes to locate and connect to a target.
Session time indicator
Our goal with Boundary Desktop is to centralize the experience of connecting to any resource on your network, for any type of user. Upon establishing a session, end users often can't tell how long their sessions will last. That information has now been added in version 1.8 of the Boundary Desktop client: a time-remaining helper appears at the top of the session, giving users a sense of how long their session will remain valid. This also paves the way for future features, such as approvals and session renewals.
Support for ARM64 architectures
Prior to this release, Boundary did not support Darwin ARM64/Apple silicon builds. Version 1.8 of the Boundary Desktop client adds support for ARM64 architectures. Download the Boundary Desktop client here.
Minor improvements and bug fixes
We have also made various minor improvements and addressed bugs uncovered since the latest release. Improvements include grant scopes for roles and new CLI commands that simplify and reduce the required sub-commands. For more information, please view the changelog.
Get started with Boundary 0.15
We are excited for users to try the new governance and usability features available in Boundary 0.15. Administrators can deploy a HashiCorp-managed Boundary cluster using the HashiCorp Cloud Platform (HCP) or a self-managed cluster using Boundary's Community or Enterprise versions. Check out these resources to get started:
Sign up for a free HCP Boundary account.
For self-managed versions, download Boundary 0.15.
Download the free Boundary Desktop client.
Watch this Getting Started with HCP Boundary demo video.
Get up and running quickly with our Get Started with HCP Boundary tutorial.
Read the documentation for storage policies and Boundary CLI search functions.
To request a Boundary Enterprise trial, contact HashiCorp sales.
View the full article
  3. Learn how to use the scp command to transfer files securely with this step-by-step tutorial by expert Jack Wallen. View the full article
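For quick reference, typical scp invocations look like this (host names and paths are placeholders):
$ scp ./backup.tar.gz user@remote.example.com:/tmp/          # copy a local file to a remote host
$ scp user@remote.example.com:/var/log/syslog ./syslog.copy  # copy a remote file to the local machine
$ scp -r -P 2222 ./site/ user@remote.example.com:/var/www/   # copy a directory recursively over a non-default SSH port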
  4. Developers using SAM CLI to author their serverless application with Lambda functions can now create and use Lambda test events to test their function code. Test events are JSON objects that mock the structure of requests emitted by AWS services to invoke a Lambda function and return an execution result, serving to validate a successful operation or to identify errors. Previously, Lambda test events were only available in the Lambda console. With this launch, developers using SAM CLI can create and access a test event from their AWS account and share it with other team members. View the full article
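A rough sketch of that workflow from the terminal; the subcommand and flag spellings below are illustrative and may differ by SAM CLI version, so consult sam remote test-event --help (stack and function names are placeholders):
$ sam remote test-event list MyFunction --stack-name my-stack                                 # list shareable test events
$ sam remote test-event put MyFunction --stack-name my-stack --name smoke --file event.json   # save a JSON payload as a shared event
$ sam remote invoke MyFunction --stack-name my-stack --test-event-name smoke                  # invoke the deployed function with it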
  5. The Docker CLI is a command-line tool that makes it easy to run Docker-related commands right from your terminal. You can use this tool to interact with the Docker engine, easily create, start, stop, and delete Docker containers, and manage Docker images and volumes. While Docker does have a desktop version, some users prefer using the Docker CLI on a Mac without it. Read this guide if you want to use the Docker CLI on your Mac system without installing Docker Desktop. View the full article
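One common way to do this (an assumption on my part - the linked guide may use a different container runtime) is to pair the bare Docker CLI with Colima:
$ brew install docker colima      # installs only the docker CLI plus the Colima runtime, not Docker Desktop
$ colima start                    # boots a lightweight Linux VM running a Docker daemon
$ docker run --rm hello-world     # the docker CLI now talks to Colima's daemon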
  6. In the dynamic world of DevOps and system administration, command-line proficiency is a crucial skill. Bash, one of the most widely used command-line shells - and the default for most Unix-based systems, including popular Linux distributions - offers immense power and versatility. Mastering Bash scripting can give you a competitive edge in the automation-reliant field of DevOps. This blog post, based on the Advanced Bash Scripting course offered by KodeKloud, serves as a comprehensive guide to mastering Bash scripting.
Understanding Bash Scripting
Bash scripts are essentially command-line programs written in the Bash shell language. They are used to automate tasks and execute commands in a series. Bash scripts simplify complex tasks and are often used in system administration and programming. The scripts are executed in a terminal window, and they can be created using any text editor.
Why Advanced Bash Scripting?
With Bash scripting, you can write scripts that perform complex operations, manipulate data, and interact with the system. It is a versatile language that can be used on almost any platform, making it an excellent choice for system administrators and developers. Learning Bash scripting is an investment in your future, as it can help you work more efficiently and effectively.
The KodeKloud Course
KodeKloud offers a comprehensive course on Advanced Bash Scripting, designed to equip learners with the knowledge and skills to use Bash effectively. The course covers Bash scripting conventions and best practices; working with variables, functions, and parameter expansions; and understanding streams and managing input/output redirection, among other topics. The course is tailored for visual learners seeking an engaging and up-to-date learning experience, and it balances theory and practice to ensure learners easily grasp Bash's intricate concepts. On the theory side, the course explores widely discussed concepts like using curly braces for variable expansion, file descriptors, and what POSIX compliance means, along with its implications for syntax choice. On the practical side, it includes guides for modern Bash features, including associative arrays that use key-value pairs to access array elements, introductory tutorials for popular command-line utilities like awk and sed, and labs for practicing each concept to complement the learning experience. By mastering the concepts covered in this course, you will enhance your Bash proficiency and gain the confidence to write superior, more robust scripts. You'll understand how to create, read, and debug scripts, and you'll master script logging and error handling. Enroll in the Advanced Bash Scripting Course!
The Power of Bash Scripting
Bash scripts can automate a wide range of tasks and manage system configurations, making your work more efficient and reliable. By taking the KodeKloud course, you will develop practical skills in Bash scripting, including writing robust scripts that follow best practices. You will also learn how to manage input/output redirection, work with variables and functions, and use parameter expansions. These valuable skills will enable you to use Bash scripting effectively in your own work.
Advanced Bash Scripting Concepts
In addition to practical skills, the course covers advanced concepts that let users leverage the full power of Bash. These include associative arrays that use key-value pairs to access array elements, as well as introductory tutorials for popular command-line utilities like awk and sed. With this knowledge, users can perform complex text-processing tasks using Bash scripts, as in the sketch after this item.
Career Opportunities with Bash Scripting Mastery
By mastering Bash scripting, you will be well-positioned to pursue career opportunities in software development, IT management, and DevOps engineering, and to prosper in the system administration and DevOps fields. Whether you're automating deployments, managing system configurations, or writing complex data-analysis scripts, mastery of Bash scripting will be a valuable asset. Enroll now!
Conclusion
Bash scripting is a powerful tool that every DevOps professional and system administrator should master. The KodeKloud Advanced Bash Scripting course provides a comprehensive guide, covering everything from the basics to advanced concepts. So, are you ready to enhance your DevOps or SysAdmin skills and gain command-line mastery? Enroll in the KodeKloud course today and unlock the power of Advanced Bash Scripting! Here's to your DevOps journey!
New to Linux and scripting? Start with our beginner courses: Linux Basics Course, Shell Scripts for Beginners. Want to certify your Linux skills? Check out our certification exam preparation courses: Linux Foundation Certified System Administrator (LFCS), Linux Professional Institute LPIC-1 Exam 101, Red Hat Certified System Administrator (RHCSA). SUBSCRIBE to gain access to this comprehensive course as well as 65+ additional courses on Linux, DevOps, AWS, Azure, GCP, Kubernetes, Docker, Ansible, Terraform, Python, and more. Join us on this transformative educational adventure and unlock the power of Bash scripts. View the full article
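As a small taste of those concepts, here is a minimal sketch (names are illustrative) using an associative array and parameter expansion:
#!/usr/bin/env bash
set -euo pipefail                                  # fail fast on errors and unset variables
declare -A ports=([http]=80 [https]=443 [ssh]=22)  # associative array (requires Bash 4+)
for svc in "${!ports[@]}"; do                      # iterate over the keys
    echo "${svc^^} listens on ${ports[$svc]}"      # ^^ upper-cases the key via parameter expansion
done
file="/var/log/app.log"
echo "basename: ${file##*/}"                       # strip the longest prefix ending in '/'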
  7. We are happy to announce the general availability of the new streamlined deployment experience for .NET applications. With sensible defaults for all deployment settings, you can now get your .NET application up and running in just one click, or with a few easy steps - without needing deep expertise in AWS. You will receive recommendations on the optimal compute for your application, giving you more confidence in your initial deployments. You can find it in the AWS Toolkit for Visual Studio using the new "Publish to AWS" wizard. It is also available via the .NET CLI by installing AWS Deploy Tool for .NET. Key capabilities:
Compute recommendations - get compute recommendations and learn which AWS compute is best suited for your application.
Dockerfile generation - the Dockerfile will be auto-generated if required by your chosen AWS compute.
Auto packaging and deployment - your application will be built and packaged as required by the chosen AWS compute. The tooling will provision the necessary infrastructure and deploy your application using AWS CDK.
Repeatable and shareable deployments - generate well-organized and documented AWS CDK deployment projects, modify them to fit your specific use case, then version-control them and share them with your team for repeatable deployments.
CI/CD integration - turn off the interactive features and use different deployment settings to push the same application bundle to different environments.
Help with learning AWS CDK for .NET - gradually learn the underlying AWS tools that it is built on, such as the AWS CDK.
View the full article
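From the .NET CLI side, the flow looks roughly like this; the tool package id and command below follow AWS's published tooling but may change between versions, so treat them as a sketch:
$ dotnet tool install --global aws.deploy.tools   # install AWS Deploy Tool for .NET as a global tool
$ cd MyWebApp                                     # from the project directory (placeholder name)...
$ dotnet aws deploy                               # ...start the interactive, guided deployment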
  8. Orca Security has extended its cloud security platform via a command-line interface (CLI) that makes it simpler to integrate with a wide range of DevOps tools. Rather than relying on agents, the Orca Security platform creates a risk profile using read-only access to block storage accessed via a runtime hosted on Amazon Web Services (AWS), […] View the full article
  9. In this post, we'll demonstrate how you can leverage the Amazon CodeGuru Reviewer Command Line Interface (CLI) to integrate CodeGuru Reviewer into your Jenkins Continuous Integration & Continuous Delivery (CI/CD) pipeline. Note that the solution isn't limited to Jenkins; it would be equally useful with any other build automation tool. Moreover, it can be integrated at any stage of your SDLC as part of white-box testing. For example, you can integrate the CodeGuru Reviewer CLI as part of your software development process, as well as run it on your dev machine before committing the code... View the full article
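To give a feel for it, a pipeline step might run the CLI against the repository and archive the findings. The flag names here are illustrative rather than authoritative, so check the CodeGuru Reviewer CLI's own --help output:
$ codeguru-reviewer --root-dir . --src src --build target/classes --output-dir ./codeguru-results   # scan source and build artifacts, write recommendations locally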
  10. YAML (YAML Ain't Markup Language) is a frequently used data serialization language, often used for the configs of tools such as Kubernetes, Jenkins, and the Serverless Framework, and supported in some fashion by most popular programming languages. More often than not we keep these YAML files in version control, so how do we enforce a consistent format/style and ensure we always push up validated YAML files? If we have a strict schema, how do we check hundreds of YAML files in one batch, or quickly check on the CLI that a single file is valid? Why wait for your app to tell you a YAML file is wrong when deployed, when you can find out quickly on the CLI? Read Article
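As one concrete approach (an assumption - the article may use a different tool), yamllint covers both the single-file and batch cases:
$ pip install yamllint                                   # a widely used YAML linter
$ yamllint config.yaml                                   # quickly check one file on the CLI
$ find . -name '*.y*ml' | xargs yamllint                 # batch-check hundreds of files at once
$ yamllint -d '{extends: default, rules: {line-length: {max: 120}}}' .   # enforce a custom style across a tree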
  11. Today, we are excited to announce that the Amazon Genomics CLI v1.5.0 has added support for workflows written in the Common Workflow Language (CWL) using the Toil workflow engine. In addition to CWL, the Amazon Genomics CLI supports workflows written in Workflow Definition Language (WDL), Nextflow, and Snakemake, enabling customers to run a wide variety of genomics data analyses, such as joint calling of genome variants and single-cell RNAseq. View the full article
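In practice a CWL run looks something like the sketch below, assuming an AGC project whose agc-project.yaml defines the workflow and a context named onDemand (names are placeholders, and subcommands may vary by AGC version):
$ agc context deploy --context onDemand                  # provision the compute environment
$ agc workflow run my-cwl-workflow --context onDemand    # submit the CWL workflow to the Toil engine
$ agc workflow status                                    # check on the run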
  12. AWS CLI, or the Amazon Web Services Command Line Interface, is a command-line tool for managing and administering your Amazon Web Services. The AWS CLI provides direct access to the public API (Application Programming Interface) of Amazon Web Services. Since it's a command-line tool, you can also use it to create scripts for automating your Amazon Web Services. In this article, I will show you how to install the AWS CLI program on Ubuntu 22.04 LTS using the APT package manager. I will also show you how to install the latest version of the AWS CLI as a Python module using Python PIP. So, let's get started… View the full article
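The two install routes described above, plus first-time configuration, boil down to:
$ sudo apt update && sudo apt install awscli   # packaged version via APT
$ pip3 install --user awscli                   # or: latest release as a Python module
$ aws configure                                # then supply credentials, region, and output format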
  13. We’re excited to announce the release of HashiCorp Terraform 1.2, now immediately available for download as well as for use in HashiCorp Terraform Cloud. Terraform is widely adopted as the standard for multi-cloud provisioning and automation for individuals and teams at any scale. This release introduces new ways to raise errors and validate dynamic module inputs, along with usability improvements for the cloud integration with CI/CD pipelines, and support for Terraform Cloud run tasks. For more details, please see the full HashiCorp Terraform 1.2 changelog. This release wouldn't have been possible without all of the great community feedback we've received via GitHub issues, as well as continued feedback from our customers. Thank you! View the full article
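Among the new validation features in this release are custom condition checks (precondition/postcondition blocks). A minimal sketch with illustrative names that you can paste into a shell to try one:
$ cat > main.tf <<'EOF'
variable "instance_type" {
  type = string
}

resource "null_resource" "example" {
  lifecycle {
    precondition {
      condition     = can(regex("^t3\\.", var.instance_type))
      error_message = "Only t3 instance types are allowed."
    }
  }
}
EOF
$ terraform init && terraform plan -var instance_type=t2.micro   # plan fails with the custom error message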
  14. A serverless application can be built using services provided by AWS, such as the AWS Serverless Application Model (SAM). AWS provides the AWS SAM CLI for developing applications based on SAM. It also provides a Lambda-like execution environment for building, testing, and debugging the applications defined by SAM templates, and the CLI can deploy a SAM application to AWS using AWS SAM… View the full article
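The core SAM CLI loop referenced above looks like this (function and event names are placeholders):
$ sam build                                    # build and stage the app per template.yaml
$ sam local invoke MyFunction -e event.json    # run the function locally in a Lambda-like container
$ sam deploy --guided                          # package and deploy the application to AWS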
  15. strongDM today announced it has added an application programming interface (API) and software development kits for Go, Java, Python, Ruby and other programming languages for invoking the single sign-on (SSO) capabilities of its infrastructure access management platform, in addition to adding support for command-line interfaces (CLIs) exposed by cloud service providers. At the same time, […] The post strongDM Extends Access Management as Code Efforts appeared first on DevOps.com. View the full article
  16. One of the essential tasks for developers and sysadmins is getting alert notifications about failed services, exhausted disk space, and other critical failures. Let us see how to send or push a direct message to a mobile device powered by Apple iOS or a Google Android phone from the Linux command line. The post How to push/send message to iOS and Android from Linux CLI appeared first on nixCraft. View the full article
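One popular way to wire this up (an assumption - the post may cover other services) is the ntfy.sh push service, driven by plain curl:
$ curl -d "Disk /dev/sda1 is 95% full" ntfy.sh/my-alert-topic   # publish an alert to a topic (topic name is a placeholder)
# The ntfy app on iOS/Android, subscribed to 'my-alert-topic', receives the push immediately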
  17. Amplify CLI helps front-end web & mobile developers provision APIs and host websites. With today’s Amplify CLI release, you gain the ability to deploy the GraphQL & REST APIs and host websites using AWS Fargate in addition to existing AppSync, API Gateway and Amplify console options. Just run the “amplify configure project” command and enable the “container-based deployments” option. View the full article
  18. A video card is a special circuit board that controls what is displayed on a computer monitor. It is also called a graphics processing unit (GPU), which calculates 3D images and graphics for Linux gaming and other uses. Let us see the top 7 Linux GPU monitoring and diagnostic command-line tools to solve issues. The post Top 7 Linux GPU Monitoring and Diagnostic Commands Line Tools appeared first on nixCraft. View the full article
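A few of the usual suspects for a quick look at GPU health (package names vary by distribution):
$ nvidia-smi                           # NVIDIA: utilization, memory, temperature
$ radeontop                            # AMD Radeon: live GPU load
$ sudo intel_gpu_top                   # Intel: per-engine usage (from intel-gpu-tools)
$ glxinfo | grep "OpenGL renderer"     # confirm which GPU/driver is doing the rendering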
  19. Steps
# In a virtualenv
$ pip install awscli
# Globally
$ sudo pip install awscli
# For current user only
$ pip install --user awscli
# Upgrade
$ pip install --upgrade awscli
Related: Installing & configuring a 'virtualenv'
  20. Customers can now share AMIs from Image Builder pipelines with AWS accounts in multiple AWS regions, using the AWS Command Line Interface (CLI). View the full article
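Inside Image Builder this is configured through the pipeline's distribution settings; as a generic illustration of what the underlying AMI share looks like from the CLI (account ID, AMI ID, and region are placeholders):
$ aws ec2 modify-image-attribute \
    --image-id ami-0123456789abcdef0 \
    --launch-permission "Add=[{UserId=111122223333}]" \
    --region us-west-2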
  21. The Azure CLI can be updated from the command line in Windows. The command az upgrade is used for this, and it has a few useful options. Like all Azure CLI (az) commands, help can be revealed by running the command with the -h flag. The two arguments that are shown with […] The article Update Azure CLI from the command-line (az upgrade) appeared first on Build5Nines. View the full article
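The basic invocations look like this (the --yes spelling reflects recent CLI versions; run the help command to confirm on yours):
$ az upgrade          # update the Azure CLI in place
$ az upgrade -h       # show the available options
$ az upgrade --yes    # skip the continue-confirmation prompt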
  22. So you have already created a Virtual Machine in Azure that is provisioned with a Public IP address and you need to remove it. Don’t worry, you do NOT need to delete the VM. You can disassociate the Public IP from the Network Interface connected to the VM, and then delete the Public IP. This […] The article Azure CLI: Delete Public IP from Existing NIC / VM appeared first on Build5Nines. View the full article
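A sketch of the two steps with placeholder resource names: dissociate the public IP from the NIC's IP configuration, then delete it.
$ az network nic ip-config update \
    --resource-group MyResourceGroup --nic-name MyVMNic --name ipconfig1 \
    --remove publicIpAddress
$ az network public-ip delete --resource-group MyResourceGroup --name MyPublicIP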
  23. The sort command in Linux arranges records in a specific order according to the options used, sorting the data in a file line by line. By default, lines starting with numbers come before alphabetic lines, and lines with lower-case letters appear before lines containing the same character in upper case.
Prerequisite: Ubuntu installed on a virtual machine and configured, with a user account created that has the privileges to access the applications.
Syntax: sort (options) (file)
Example: Here is a simple example of sorting a file containing a list of names. The names are not in order; to arrange them, you need to sort them. Consider a file named file1.txt. Display its contents with:
$ cat file1.txt
Now sort the text in the file:
$ sort file1.txt
Save the Output in Another File
The sort command only displays its result; it does not save it. To capture the result, use the -o option. Consider a file named sample1.txt containing the names of cars. We want to sort them and save the resulting data in a separate file named result.txt. You can either redirect sort's output into the file, or have -o write it there directly; either way, the data of sample1.txt ends up sorted in result.txt. Display the result with cat:
$ sort sample1.txt > result.txt
$ sort -o result.txt sample1.txt
$ cat result.txt
The output shows that the data is sorted and saved in another file.
Sort by Column Number
Sorting is not limited to the first column; rows can be sorted by another column. Take a text file containing students' names and marks that we want to organize in ascending order of marks. Use the -k option to pick the column, and -n for numerical sorting. Since the marks are in the second column, 2 is combined with n:
$ sort -k 2n file3.txt
Check Whether a File Is Sorted
If you are not sure whether a file is already sorted, the -c option settles it. Two basic cases:
Unsorted data: consider an unsorted file containing vegetable names. With -c, if the data is unsorted, the output reports the line number and the first out-of-order word:
$ sort -c sample2.txt
The output shows that the 3rd word in the file was out of place.
Sorted data: when the data is already organized, nothing more is needed. Consider the file result.txt:
$ sort -c result.txt
No message is shown, which indicates that the data in the file is already sorted.
Remove Duplicate Items
Here is one of the most useful sort options: -u removes repeated lines, keeping the file organized and its data consistent. Consider file2.txt, which lists subject names with one subject repeated several times:
$ sort -u file2.txt
The repeated items are removed from the output, and the data is sorted as well.
Sort Using a Pipe
To sort a directory listing by file size, list the directory with ls -l and pipe the output into sort:
$ ls -l /home/aqsayasin/ | sort -nk5
Random Sorting
Sometimes you want to shuffle the data rather than order it by any criteria; when there are no sorting criteria, random sorting is preferred. Consider a file named sample3.txt containing the names of continents:
$ sort sample3.txt -R
The output shows the file's items arranged in a different, random order.
Sort the Data of Multiple Files
One of the most useful applications of sorting is sorting the data of several files at once, using the find command. The output of find acts as the input of the sort command after the pipe. With -print0, find emits one file name per record. For instance, consider three files named sample1.txt, sample2.txt, and sample3.txt, where "?" matches the single digit that follows the word "sample". find fetches all three files, and their data is sorted with the help of sort:
$ find -name "sample?.txt" -print0 | sort --files0-from=-
The output shows the data of all the sample?.txt files arranged and organized alphabetically.
Sort with Join
Now for an example quite different from those discussed earlier in this tutorial: in addition to sort, we use join. Both files are first sorted and then joined using the join command. Consider the two files you want to join, and apply:
$ join <(sort sample2.txt) <(sort sample3.txt)
The output shows that the data of both files is combined in sorted form.
Compare Files Using Sort
We can also compare two files with the same technique: both files are sorted first, and then their data is compared. Consider the same two files as in the previous example, sample2.txt and sample3.txt:
$ comm <(sort sample2.txt) <(sort sample3.txt)
The data is sorted and arranged in columns: lines unique to sample2.txt, lines unique to sample3.txt, and lines common to both.
Conclusion
In this article, we have covered the basic functionality and options of the sort command. The Linux sort command is very beneficial for maintaining data and filtering useless items out of files. View the full article
  24. We are excited to let you know that we have released a new experimental tool, and we would love to get your feedback on it. Today we have released an experimental Docker Hub CLI tool, the hub-tool. The new Hub CLI tool lets you explore, inspect, and manage your content on Docker Hub, as well as work with your teams and manage your account. The new tool is available as of today for Docker Desktop for Mac and Windows users, and we will be releasing it for Linux in early 2021. The hub-tool is designed to map as closely as possible to the top-level features we know people are using in Docker Hub, and to provide a new way for people to start interacting with and managing their content. Let's start by taking a look at the top-level options we have.
What you can do
We can see that we have the ability to jump into your account, your content, your orgs, and your personal access tokens. From here I can dive into one of my repos, and from there I can list the tags in one of those repos. This also now lets me see when these images were last pulled. Changing focus, I can go over and look at some of the teams I am a member of to see what permissions people have, or I can have a look at my access tokens.
Why a standalone tool?
I also wanted to mention why we have decided to do this as a standalone tool rather than a Docker command, with something like docker registry. We know that Docker Hub has some unique features, and we wanted to bring these out as part of this tool and get feedback on whether this is something that would be valuable to add to the Docker CLI in the future (or which bits of it we should add!). Given that some of these features are unique to Hub, that we wanted feedback before adding more top-level commands into the Docker CLI, and that we wanted to do something quick, we decided to go with a standalone tool. This does mean that this tool is an experiment, so we do expect it to go away sometime in 2021. We plan to use the lessons we learn here to make something awesome as part of the Docker CLI.
Give us feedback!
If you have feedback or want to see this move into the existing Docker CLI, please let us know on the roadmap item. To get started trying out the tool, sign up for a Hub account and start using the tool in the Edge version of Docker Desktop. The post Docker Hub Experimental CLI tool appeared first on Docker Blog. View the full article
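A quick tour of the commands mentioned above; repository and account names are placeholders, and since the tool is experimental the subcommand spellings may change:
$ hub-tool login mydockerid          # authenticate against Docker Hub
$ hub-tool repo ls                   # list your repositories
$ hub-tool tag ls mydockerid/myapp   # list tags, including when images were last pulled
$ hub-tool token ls                  # inspect your personal access tokens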