Robotic Process Automation (RPA) Database Reviews
Showing reviews of the top-ranking products in Robotic Process Automation (RPA) that contain the term Database
Kofax RPA: Database
Kofax RPA has allowed us to create a program in which the top 100 forms used across the organization are extracted and pushed into a database, from which any team globally can retrieve the information in those forms.
Automation Anywhere (AA): Database
In V10 and V11, most of the Windows and web-based commands are helpful. We mostly use the Database commands in order to make the process faster and more secure, rather than using Excel or a CSV for reading the input files.
My primary use case starts with downloading an inventory file in IE and then storing it in the database. This uses DB variables that interact with the banking application.
By automating the generation of quotations, it has made it much easier for us to trace the same customer data. It gives us real-time capability for generating reports.
Automation Anywhere has equipped us with amazing tools, including Bot Insight, IQ Bot, Discovery Bot, etc. The Discovery Bot is used to scan the entire system for tasks that can be automated. It adds a lot of value to our organization.
Extracting PDFs is very easy using Automation Anywhere. Its strong points are that it is easy to use, easy to deploy, easy to share, and has multi-user support. Different devices can be used to execute the bot.
The Bot Store provides some utilities for free, which can then be integrated into my workflows easily.
The licensing model and cost are very impressive. You no longer have to worry about infrastructure setup costs, such as servers and databases, because these come free of cost with the A2019 cloud version.
I have used AA to automate various business processes, ranging from the HR and AR teams to Sales.
Starting with HR, we have automated various processes from auto-downloading the data and uploading it to a database without human intervention, to combining multiple file data into one.
For the AR team, we have automated the process to download data from the server, generate the AR report, and then update the dashboards for senior management.
For Sales, we created a form where the Bid Manager would update pursuit details and the tool would pull that data and then send the request to required teams for SPOC allocation.
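The HR automation described above (auto-downloading data files, combining them, and uploading the result to a database without human intervention) can be sketched roughly as follows. This is an illustrative Python sketch, not the reviewer's actual bot: the file layout, column names, and SQLite schema are all assumptions made for the example.

```python
import csv
import sqlite3
from pathlib import Path

# Hypothetical illustration: combine several downloaded CSV extracts
# into one database table without human intervention. File names and
# the schema (employee_id, field, value) are assumptions, not taken
# from the review.
def load_extracts(folder: str, db_path: str) -> int:
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hr_data (employee_id TEXT, field TEXT, value TEXT)"
    )
    rows = 0
    # Process files in a deterministic order so repeated runs behave the same.
    for csv_file in sorted(Path(folder).glob("*.csv")):
        with open(csv_file, newline="") as f:
            for record in csv.DictReader(f):
                conn.execute(
                    "INSERT INTO hr_data VALUES (?, ?, ?)",
                    (record["employee_id"], record["field"], record["value"]),
                )
                rows += 1
    conn.commit()
    conn.close()
    return rows
```

In a real AA workflow the equivalent steps would be built from Database and Loop commands rather than hand-written code; the sketch only shows the data flow.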
Other than the Control Room setup with the database, which takes a little time, client installation and configuration were straightforward.
Our primary use case is automating daily routine jobs, which has saved me a lot of time and enhanced my automation knowledge. We have automated the daily reports, which trigger automatically once jobs are completed in the backend database. This has saved our team a lot of extra work, in both workforce and time.
The implementation of a bot is easy, and it can be built in less time compared to other automation tools. No coding knowledge is required to build an automation bot. It's very simple, easy to use, and easy to maintain.
Our primary use case is building solutions involving RPA bots. We have implemented two standalone infrastructures, each with an A2019.16 Control Room.
Multiple developers are working with the tools and commands in A2019 to create bots that will be deployed for clients to address their business requirements.
The infrastructure is running on the private cloud.
The Control Room is running on the latest Windows Server, and the latest SQL Server is running the Control Room database.
Developers use VMware virtual desktops to execute their bots. Chrome and IE browsers are used by the developers.
We use this solution for all the business processes and mainly to integrate with Excel files, our database, and our ERP.
We basically process data that are in form of Excel, PDF, or CSV formatted files. We also interact with information that is stored in our database.
Generally, we automate a lot of processes that are repetitive and time-consuming. We work mainly with processing data, thus reducing the time needed to do the process manually. The workforce is also reduced, for simple tasks like logging in to a forum as well as complicated tasks in banking or finance. All of these are possible.
reviewer1681449 says in an Automation Anywhere (AA) review
Software Engineer at a tech services company with 501-1,000 employees
Setting up Automation Anywhere is a bit complex. When I entered into the RPA industry, Automation Anywhere had just been launched. Automation Anywhere was first released in 2014 or thereabouts, and I started working on it in 2017. I was just a novice when I began my journey with RPA, and it was a whole new concept in my organization as well. When we were setting up the control room server and the client, it was a bit daunting. We had to do so much in the background database and then the client. It is pretty complicated to set up the Automation Anywhere control room and the high availability clusters.
reviewer1180077 says in an Automation Anywhere (AA) review
Lead Engineer at a financial services firm with 10,001+ employees
Uchechi Sylvanus says in an Automation Anywhere (AA) review
Team Lead, Process Improvement at Fidelity Bank Plc
We are currently using Automation Anywhere for a Mastercard customer-complaint bot that logs tickets on our top-priority websites. When customers complain, the complaint can be logged directly, or the agent logs it in one of our databases. The bot goes to the database and picks up the ticket that was logged on a top-priority website. The ticket then becomes a reference ID number that we use to verify it has been logged properly. Third parties can take notes, and once the ticket is logged, they can hand it over to us.
We did this procedure manually before the use of Automation Anywhere.
Setting up Automation Anywhere (AA) was not that hard, but I can't say that it was easy. I don't know if I did something wrong in the configuration when I first installed the tool. You have to configure the database yourself, and I believe that the tool should configure itself.
Blue Prism: Database
The user interface could be improved, and I would like to see a web interface to easily access robot information. The key value of Blue Prism is also its main weakness: because Blue Prism is quite connected to its data, anything you do on Blue Prism is logged in the database. It's quite difficult to read these logs, so I would like a more valuable way of understanding how robots work in the environment. Blue Prism also doesn't manage variables well. It's annoying because you have to declare every variable visually, but there is no list of the variables; you have to open the graphical elements and check the variables.
The solution is not user-friendly. It has a very high learning curve. People should be able to learn it easily so that they will get interested in using it.
While the solution is more secure, it's very hard to find people trained on it. I need different people, not only those who are trained on RPA tools, and I cannot find people skilled in Blue Prism. Without the resources, people just move to Automation Anywhere or UiPath, which are more user-friendly. In comparison, UiPath is much easier to use, and you can find people who are well-versed in it.
The product needs to put out more videos, similar to Automation Anywhere, which does that a lot. You can find a lot of videos online in relation to Automation Anywhere and UiPath, however, this is not the case with Blue Prism. There's just less information available.
Blue Prism needs to provide better training. They need to start something similar to Automation Anywhere University or UiPath Academy. If they had some courses at different levels (basic, advanced, and master), there would be more educated personnel available.
The solution needs to provide a trial license - whether it is on cloud or on-premises. They need to provide a standard environment to work with. We need to have practice in installing the data center and connecting it to the database.
We need to understand how we can migrate from a lower version to a higher version, how load balancing works, and how we can manage it. We need to understand better how our core system is managing that. Proper training would help with this understanding.
reviewer1527024 says in a Blue Prism review
RPA Developer at a tech services company with 10,001+ employees
It is not hard to install. You just need to design the database.
Deployment is pretty fast. For deployment within our team, the server is the same, and we just configure it to run on our machine. For deployment outside our team, we just need to install the server and then configure the machines. It is not that complicated.
reviewer1625121 says in a Blue Prism review
Tech Manager at a computer software company with 5,001-10,000 employees
The initial setup is fairly okay in comparison. With Blue Prism, I could configure all of the types of configuration that Blue Prism supports. Whether it is single sign-on, app server-based, database, or single login, it was all quite simple.
One of the most powerful features in Blue Prism is exception handling. It's one of the features that really differentiates it from other platforms.
It is really easy to use; it's user-friendly.
Also, you only have one database per environment and can consolidate everything. I think that helps you have more control, trace all the data and all the logs, and scale in an easier way.
It is stable and easy to scale.
It is easy to set up as well.
One of our use cases is for our insurance team where we have built a prototype which has helped the insurance team cut pricing. That is one of the automations that has made a difference.
Also, the fact that this is a SaaS solution means we are able to innovate much faster when it comes to automation. We have been able to complete use cases in as little as one month. There can be problems with the on-premises version because there are certain restrictions for accessing that database. But because this is on the cloud, we can access the data from anywhere in the world. That is very beneficial.
UiPath has also helped to reduce the amount of maintenance work related to our automation operations. It has a feature to keep the different automatic tasks monitored. That monitoring helps us if there's any problem in an automation. We get notification that there is a problem. If we don't know about those problems then, after some time, maintenance on them will be harder. It does help us to maintain things.
Before, we were using the on-premises service for our automation activity. By using UiPath Automation Cloud, we don't need to go anywhere; there is a single tool where we can keep track of the maintenance. Doing so with the on-prem solution was much more time-consuming and slow. With UiPath it's so much faster, so it helps us cut costs.
And in terms of overall cost savings, before we implemented one of our automations there were around 100 people taking care of the task. Using UiPath, we have automated that task and we have saved the fees of 100 people.
Migrating from our on-premises solution to the cloud was not a typical case because we lost our on-premises deployment during the cyberattack. We had at least a few months without the Orchestration solution. When it comes to execution runtimes, where we run our processes, we used the same machines.
Basically, we had to set things up from scratch on the cloud. The process was pretty straightforward, and the fact that we didn't have to set up the Orchestration tool saved us from a lot of the complexity in the setup process. Normally, this is the complex part, including setting it up with the databases. We just had to connect our runtime with the Orchestration platform, which made it much easier.
With respect to the setup costs, the cloud setup balanced out because you don't pay for the orchestration platform, but you pay a little more for the individual licenses.
We have a use case that involves an invoice billing process, where vendors from an external organization submit their details for the invoice. This automation works as expected, independently of anything else. It is also a good example of how we were able to scale RPA benefits in the company with the automation of a specific process that requires human-robot collaboration.
Our internal tools include the database where all of this information is stored, and we have a second automation that is used by the billers in our organization to tally the data that includes details such as what each vendor has submitted to get their payments.
We built a third automation in UiPath, which basically compares these first two. But, due to the complexity and the nature of the tally that has to occur, we require some human input in between certain steps.
For these particular steps, we have developed a four-bot configuration. These are four separate bots that run and a couple of them have an attended automation part, where a human can intervene. It's a verification step, where the human can decide whether or not something is okay. Specifically, the bot compares two fields and if they match, then it's great, but if not, it triggers a request to a human user for manual verification. If they approve then it is marked as a successful verification.
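The human-in-the-loop verification step described above (the bot compares two fields; a match is auto-approved, a mismatch is escalated to a person) can be sketched as follows. This is an illustrative Python sketch of the decision logic only, not a UiPath API; the function name, field handling, and return shape are assumptions for the example.

```python
# Minimal sketch of the verification step: auto-approve on a match,
# otherwise flag the item for manual (attended) review. Normalization
# by trimming and lowercasing is an assumption for illustration.
def verify_fields(extracted: str, expected: str) -> dict:
    """Compare two fields; decide whether a human needs to intervene."""
    if extracted.strip().lower() == expected.strip().lower():
        return {"status": "approved", "needs_human": False}
    return {"status": "pending_review", "needs_human": True}
```

In the reviewer's setup this decision would route the item to the attended bots, where a person approves or rejects the mismatch.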
Because we use technologies like OCR, there are details that cannot always be interpreted properly. This is where we need an additional check, which is the reason that we have humans in the loop as part of the process.
reviewer1509951 says in a UiPath review
Associate Consultant at a computer software company with 10,001+ employees
We have separate, dedicated test data in three different environments. Orchestrator has a database and email server, so everything is in Orchestrator. Apart from the servers, products, and services, everything has a separate operations team, which has eight to 10 members, who take care of everything.
The best feature in UiPath is their robotic enterprise framework because that is an inbuilt processing framework for utilizing their work queues. It's plug-and-play, and already pre-built to where you don't have to start from scratch. It's enterprise-grade and ready to be used. All you need to do is populate your dispatcher, create a queue, create a performer, and you're good to go.
The highest benefit of it is that it's just there, ready to use, and you don't need to start from a blank screen. You don't have to figure out, for example, how to create an environment where the robots can check if there's anything in the queue to be worked on. The framework is already there. The other tools that I've used, like Blue Prism, don't have that built-in quite as well.
My perspective and overview are from that of a developer, and I find that the recorder feature is really good. This is because UiPath lets you record your actions on the screen. If you want to interact with a web-based interface, for example, you have UiPath record your actions, and it then builds the activities that you would need in order to replicate those actions through the robot. Although it's not perfect and the result does need to be reviewed and adjusted, it speeds up development quite a bit. This is especially true for basic development tasks like populating fields, clicking buttons, and navigating on a web page.
Compared to other RPA tools that I have used, something that stands out to me in UiPath is that it has a very extensive library of activities. Those activities are easy to search for and use.
When you are writing code, there is a feature called IntelliSense, which autocompletes your code. More specifically, when you're typing code, if you're starting to type the name of a variable, it will show you all of the variables available and you can just click them. It's very interactive and it's reminiscent of the Microsoft Visual Studio environment, both from the UI perspective and the coding perspective. This means that developers that are familiar with Visual Studio will probably feel right at home using UiPath. It's very developer-friendly and it's geared towards appealing to existing developers.
The UiPath Academy courses definitely help in the process of bringing employees up to speed. The Academy is the go-to place for UiPath learning and I think that other RPA tools are copying this model of disseminating knowledge, being a lot more open with training, making it freely available, and providing an online classroom. These are things that UiPath has always done, and it certainly helps new developers get upskilled in RPA, and specifically with UiPath.
When it comes to ease of use, UiPath is intuitive insofar as the basic features have a low learning curve. However, if you want to take full advantage of what UiPath can do, and if organizations want to create more sophisticated automation solutions, it is more difficult. For instance, automations involving back-end access, such as writing directly to SQL databases or using APIs, have a steep learning curve. In fact, I think the learning curve is exponential.
If you just want to make a robot that sends an email, that's really easy to do. But, if you really want tangible benefits, like if you really want something that solves a business problem, it is a huge learning curve and it takes a while to master. Obviously, it does have that low-code requirement, but I would say that's only for entry automation projects, like proof-of-concept or something along those lines. For something that really solves a business problem, you would need code, because that just makes it a lot more robust and a lot more powerful if you can custom-code certain steps of the process.
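The kind of custom-coded back-end step the reviewer has in mind (reading work items straight from a SQL database instead of scraping them from a UI) might look like the following. This is a hedged sketch in Python with SQLite standing in for the real database; the table name, columns, and status values are all assumptions made for the example.

```python
import sqlite3

# Illustrative back-end database step: fetch pending work items
# directly from SQL rather than via the application front end.
# Schema (work_items: id, payload, status) is an assumption.
def fetch_pending_items(db_path: str, limit: int = 10) -> list:
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row  # access columns by name
    rows = conn.execute(
        "SELECT id, payload FROM work_items WHERE status = 'pending' LIMIT ?",
        (limit,),
    ).fetchall()
    conn.close()
    return [dict(r) for r in rows]
```

In UiPath this logic would typically live in an Invoke Code activity or a custom library; the sketch shows why such steps demand real programming rather than drag-and-drop alone.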
reviewer1510449 says in an UiPath review
RPA Developer at a maritime company with 1,001-5,000 employees
We just haven't scaled to a point yet where there has been any kind of return on investment.
There are not very many users because the stuff that we have automated so far has just taken work off people's hands. Where a person used to spend all day uploading pricing data into a database, we have a bot doing that now. So, people are not using UiPath, they have just sort of been relieved of their duties. While that sounds bad, we have made an effort to find areas where FTEs get to spend time doing what they are best at.
Stability is something that we should consider in two parts. The first concerns the bots and how they are running the tasks on the machines. This comes down to what kind of developers we have because if you are developing properly, and implementing all of the exceptional cases that may occur during the execution of the process, it's very good. I haven't had any issues in cases like this.
The second part is the Orchestrator, and I haven't had issues with this either. In the more than three years that we have been using this environment, including the time in production and our test environments, we have never had an issue.
We have had two or three incidents because we didn't have enough space left on the database storage, but that was not related to UiPath. Rather, it is related to the infrastructure. Another time, the SSL certification expired so we had to renew it. Otherwise, stability-wise, we haven't had any problems.
We have licenses for 15 robots and they are running internal processes. We develop them using UiPath Studio and we only use unattended automation.
Our primary use case, which is 70% of what we have automated, is related to our booking system. Instead of having 10 agents who handle the bookings or create the reservations, the work is done by the robots. Sometimes bookings are very simple, with just airfare or a hotel, but in our case it's quite complex. We call it dynamic packaging: a booking can have a flight component, a hotel component, different attractions, meal options, a rental car, and more. Entering all of the options manually can take 10 to 15 minutes just to create a single booking. It is similar when we perform other tasks, such as making a payment. These things are normally done in our target system. I have created robots and workflows in UiPath that are triggered by the database, and they complete these tasks automatically. We have 15 robots conducting the job.
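The database-triggered hand-off described above can be sketched as a claim pattern: the booking system writes a row, and a robot claims the next unprocessed row and marks it in progress. This is an illustrative Python/SQLite sketch, not the reviewer's actual implementation; the table name and status values are assumptions.

```python
import sqlite3

# Sketch of a database-triggered robot: the booking system inserts rows
# with status 'new', and each robot claims the oldest one atomically by
# flipping it to 'processing'. Schema is an assumption for illustration.
def claim_next_booking(conn: sqlite3.Connection):
    row = conn.execute(
        "SELECT id FROM bookings WHERE status = 'new' ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None  # nothing pending; the robot idles until the next poll
    conn.execute(
        "UPDATE bookings SET status = 'processing' WHERE id = ?", (row[0],)
    )
    conn.commit()
    return row[0]
```

With 15 robots polling the same table, this select-then-update pattern is what keeps two robots from picking up the same booking.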
The second use case replaces the agent: once we get all the information from the outside system using a .NET application and store it in the database, it creates parameters for the robots to make a booking or reservation for attractions at Universal Studios.
The third use case covers all of Europe and it is a completely automated car booking system.
Basically, our use cases are all about travel and booking systems for Universal Studios, general dynamic packaging, and car rentals.
The most valuable feature that we are using is UiPath Apps because it makes it very easy to implement tasks. It is very easy to scale operations, which is important because we're not talking about just five or ten agents. We're talking about 1,000 to 2,000 agents. The Apps feature helps us to scale very quickly and very easily. We only need to develop one or two bots and then link them to UiPath Apps to process everything. All of the integration between the bots and the human, along with any scheduling that needs to be done, is taken care of by Apps. In our situation, the Apps feature is the best solution to handle this scale.
Utilizing our bots is very easy, and it is done using the licenses that we have with our partner, UiPath. We can access our licenses, distribute them to the customers, and use them dynamically. This is all done in a very easy manner; we just navigate to the web-based hub, where we have access to everything that we need.
UiPath is highly customizable and this is helpful for us because we can develop models and frameworks that can be reused for different tasks and different customers. For example, if we have a customer with a process that is very similar to one that we have previously developed for somebody else, we can reuse the models to scale the bots. This makes the new development very easy and very fast.
The Agent Console is able to provide customer insight in conjunction with the task and process mining features that we use. We install the tool into the machine that the customer uses every day, where it will capture the manual tasks and processes into a database. The insights that we receive are related to whether a process is a good candidate for RPA. For example, if it takes the human a lot of time to complete, or they are having trouble with it, then it might be suitable for RPA because putting a bot in place can optimize performance.
Another reason this is important is that human operators work very hard with day-to-day tasks, and they don't have much time to stop and look for processes that can be automated. Using task and process mining, it starts pulling out those insights. For example, it looks for the number of screens that the human is accessing and clicking on. It looks at each click, as well as every navigation and extraction. In the end, it generates a report for us.
The Agent Console has helped decrease the average agent handling time, which is our main goal when it comes to these massive business operations. Average agent handling time is the metric we primarily work with, so everything we do is aimed at reducing it. RPA in our use case is not used only to reduce headcount or FTEs; it is used to boost this particular KPI too. In one of our use cases, we have had an average decrease of 30% in agent handling time, which is very considerable.
We provide RPA services and I am currently working on two different projects.
These projects are for two different clients that are each using a different version of the platform. In both cases, it is an on-premises deployment. Our clients only use the end product and don't do any development themselves.
One of my clients is a retail organization and the primary use case is invoice automation. Previously, the process was totally manual. They have different products and different departments and for each and every department for which they bill, like HR, there are printing and supply chain tasks to be completed. As part of their process, they generate invoices monthly.
To generate invoices, they need to gather data from different sources, such as a database or Excel files. What we have done is fully automated the process. They now only need to work with a consolidated Excel sheet and then email it, once complete.
Once they send the email to a particular email address, the robot retrieves it and reads the attached Excel sheet. After doing some cleaning, consolidation, and validation, it generates invoices each month in a particular template, and then it submits them to the EBS portal.
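The cleaning-and-consolidation step described above can be sketched as follows. This is an illustrative Python sketch: the real robot reads an Excel attachment and submits to the EBS portal, whereas here plain row dictionaries stand in for the sheet, and the column names, grouping key, and invoice template are assumptions made for the example.

```python
from collections import defaultdict

# Hedged sketch of the consolidation and validation step: sum billed
# amounts per department, dropping rows that fail basic validation.
# Column names ("department", "amount") are assumptions.
def consolidate(rows):
    totals = defaultdict(float)
    for row in rows:
        dept = row.get("department", "").strip()
        try:
            amount = float(row.get("amount", ""))
        except ValueError:
            continue  # validation: skip rows with non-numeric amounts
        if dept:  # validation: skip rows with no department
            totals[dept] += amount
    return dict(totals)

def render_invoice(dept: str, total: float) -> str:
    # Hypothetical plain-text template standing in for the real one.
    return f"INVOICE\nDepartment: {dept}\nTotal due: {total:.2f}"
```

In the actual process, one such invoice per department would then be generated monthly and submitted to the EBS portal by the robot.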
The manual invoicing task used to take between two and two and a half weeks. Now, they start it at 4:30 when they leave and it works overnight. The process is now fully completed within two days. The time saved is now time that can be used to focus on higher-value work. It has also improved employee satisfaction.
My advice for anyone who is implementing UiPath is to always check the documentation before you try to look for answers on the forum. Another good point is that when you have a problem, there are plenty of people in the UiPath community that can help you in a few minutes. This is the perfect solution, in this case.
From the maintenance side, you have to remember to scale up your database along with the automation, because otherwise it can really slow down your process.
The biggest lesson that I have learned from using UiPath is to always create a backup copy of Orchestrator before you update it. This was a very big lesson for us because we had an issue with the installation. It is also really important to back up the related databases.
I would rate this solution an eight out of ten.
The initial setup is pretty straightforward. I'm not a system admin or anything like that and I was able to set up UiPath on the server. It's pretty good.
How long it takes depends on the database that I'm working on. That said, last time it took less than one full working day. It depends on how much data you have to back up; usually, it's a few hours.
I am an RPA developer and I work with UiPath in that capacity.
Our current use case involves the automation of a process involving healthcare-related data. This is confidential data that is received from the customer and inserted into Oracle forms. Reports are then generated from it and these reports are then used by the organization, which is in the healthcare domain, for their analysis.
The data being analyzed includes medical and treatment history. For example, with the current pandemic going on, there are all sorts of healthcare data related to it, including various types of treatments. When somebody walks into any clinic or hospital, all of the treatment is entered into a database, and we get an extract of it. The analysis is used to get more details.
Another interesting use case, prior to this one, involved the documenting of invoices. We were working with approximately 250 different samples of purchase invoices, many in different formats. One might be a native PDF file, whereas another could be a scanned PDF, and yet another might be a simple handwritten invoice that was converted into a PDF based on a picture that was taken from a mobile device. We were receiving these invoices from our client and they wanted to extract data from them. It was accomplished by using the Document Understanding features in UiPath.
The other notable use case had to do with issuing refunds for purchases that were made on an e-commerce site. When a customer made an order and there was a problem that resulted in them wanting a refund, there were multiple ways that the client could request one. A refund application could be received by the customer care department in the form of a simple call, which was a verbal request, or as an email written by the customer, or as an automatically-generated email that was created based on filling out a form on the website.
Regardless of which of the three input methods is used, the refund request is gathered and sent to a mainframe application. At that point, the information is extracted from the mainframe and the refund is issued using another application.
The automation of these tasks using features such as artificial intelligence and document understanding has reduced our costs. For example, with the invoice processing use case, there was a team of between 20 and 25 agents who were doing it manually. Obviously, a team of that size has a large cost associated with it. Also, the volume was very high, which meant that the team was not able to deliver on all of the work. There are approximately 250 vendors sending invoices to our customer to process the data, which translates to about 1,000 documents being sent on a daily basis, to be processed by only 25 people. It was a huge task. With this level of volume, people tend to get frustrated.
We implemented the automation, and the team size has now been reduced to only five or six people, who are only required to monitor the bots. For example, they check to make sure that the data being fetched using document understanding is up to par. We have set the minimum confidence for the documents being scanned at 90%, and each day a report is generated and sent to the customer. Overall, it was a very cost-saving implementation.
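The 90% confidence gate mentioned above can be sketched as a simple triage: extracted fields at or above the threshold are auto-accepted, and the rest are routed to the monitoring team. This is an illustrative Python sketch; the field structure is an assumption, not the Document Understanding API.

```python
# Sketch of the confidence gate: fields extracted with at least 90%
# confidence pass through; lower-confidence fields go to the human
# monitoring team. Field dictionaries are an assumed structure.
CONFIDENCE_THRESHOLD = 0.90

def triage(fields):
    accepted, needs_review = [], []
    for field in fields:
        if field["confidence"] >= CONFIDENCE_THRESHOLD:
            accepted.append(field)
        else:
            needs_review.append(field)
    return accepted, needs_review
```

The daily report sent to the customer would then summarize both buckets, which is what lets a five-person team monitor the whole volume.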
I have worked on a number of use cases, and one of them that I can discuss was used in a contact center environment. This is a project that we had done for an automotive insurance company, and it had to do with incident management. Our contact center received the first notice of loss (FNOL) from incidents, such as an accident.
When an accident occurs, they raise a ticket to our customer service representative. This can either be done using a chatbot, which is integrated with our ServiceNow platform, or they can call the customer service representative. In the latter case, the customer service representative will pick up the call and get the details. This includes adding their insurance ID and a couple of other fields, and that is integrated into our system.
Our system was acting as an intermediate between their existing platform and ServiceNow. Part of the system included a database, where they were checking to see if the insurance amount the claimant is asking for is above the limit. There were other similar business rules, as well, which the bot was responsible for checking. Based on the result of these checks, the claim was automatically approved, and then a corresponding ticket was raised in ServiceNow.
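The business-rule check described above (is the claimed amount above the limit, along with similar rules) can be sketched as follows. This is an illustrative Python sketch: the limit value, the policy-active rule, and the routing labels are assumptions for the example, not the insurer's actual rules.

```python
# Illustrative version of the bot's rule check: claims at or under the
# limit on an active policy are auto-approved; larger claims go to
# manual review. The limit and rules are assumptions, not real ones.
AUTO_APPROVE_LIMIT = 5000.0

def route_claim(claim_amount: float, policy_active: bool) -> str:
    if not policy_active:
        return "rejected"
    if claim_amount <= AUTO_APPROVE_LIMIT:
        return "auto_approved"
    return "manual_review"
```

In the reviewer's system, an "auto_approved" outcome is what triggers the corresponding ticket in ServiceNow.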
There was also a manual process, where a person would go to the site where the actual accident took place. They would do their analysis and then create a review report, and that report would automatically be handled by an attended robot. The robot would take the details from the agent and, based on the review, fetch certain details like the approved amount.
The bot is responsible for sending other information to ServiceNow, including, for example, details about damage to the vehicle. If there are scratches on the front or scratches on the back, then these details are all posted to ServiceNow. At that point, ServiceNow has a workflow that is initiated.
The workflow uses the information taken by the representative and moves from the review stage to agent verification, and then to a mainframe. The system running on the mainframe is responsible for generating checks, according to the amount that is approved, and then mailing them to the claimant at the address they have on file.
UiPath makes it very easy to develop automations. The interface is user-friendly and makes it easy to perform operations or use services, whether it is a database or another product. We can perform tasks on Microsoft Azure, for example. Many operations can be completed using inbuilt packages.
Whatever activity we want to perform only involves using the drag-and-drop capability, so it is easy to do. Anybody can do it; no programming-specific knowledge, such as .NET, is required.
It is easy to develop custom components, which makes life easier.
UiPath allows us to implement end-to-end automation starting with the process analysis and ending with the monitoring. This is important to us because for any new process that we identify, using the task capture methods helps us to gather the documents that are required to automate it. After we develop the automation in Studio, we can easily monitor it using Orchestrator. It is helpful to have a complete solution from start to end, with all of the features that it has.
Using automation means that we increase our process output with minimal effort, which is something that every company wants to do because there is a saving in terms of manpower. It is definitely helpful in our organization.
The amount of time or cost savings depends on the process. For example, some processes that take four or five people to complete can be done using a single bot. Also, people can only work six or seven hours a day, whereas, with automation, the bot can run 24 hours a day. Not only is the process done more quickly but at less cost.
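As a rough back-of-the-envelope illustration of the point above (the figures are taken from the review: a six-to-seven-hour human workday, a 24-hour bot day, and processes that take four or five people), the availability gain can be sketched as:

```python
# Back-of-the-envelope availability comparison; figures assumed from the review.
HUMAN_HOURS_PER_DAY = 7      # a person works six or seven hours a day
BOT_HOURS_PER_DAY = 24       # a bot can run around the clock
PEOPLE_PER_PROCESS = 5       # some processes take four or five people

availability_gain = BOT_HOURS_PER_DAY / HUMAN_HOURS_PER_DAY
person_hours_replaced = PEOPLE_PER_PROCESS * HUMAN_HOURS_PER_DAY

print(f"One bot covers {availability_gain:.1f}x a single person's daily hours")
print(f"Replacing {PEOPLE_PER_PROCESS} people frees {person_hours_replaced} person-hours/day")
```

The exact ratio will vary by process, but this is why a single unattended bot can absorb the workload of several people on a suitable task.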
Attended automation has helped to scale RPA benefits because we have some scenarios where human collaboration is required. These are business-critical processes, so any level of automation is important for us.
In addition to savings in time and cost, UiPath further saves us money because of the reduction in human error. When a human is performing a task, mistakes happen. When the bots are used, there are no errors and when the number of mistakes is reduced, the business has more income.
UiPath has helped to speed up digital transformation, although hosting it requires IT support. For example, if UiPath needs to be updated or our infrastructure needs to be expanded, then it requires the help of IT support.
The most valuable features are some of the panels in UiPath Studio. For example, there is a debugging panel and a Designer panel. The debugging panel is useful because without it we could not solve any problems. The debugging panel provides functionality such as Step Into and Step Out, and we have highlight buttons. It helps us to analyze our code, what is wrong in a solution, and debug from the start to the end, to make the solution better.
The Designer panel is where we create a workflow or step-by-step process, the place where a developer develops the code.
Within UiPath Automation Cloud, we are using Orchestrator in which we can
- deploy the bots and maintain services
- create attended and unattended robots for different versions of machines and manage which robot runs in a particular environment
- use queues and schedules to configure when bots run repeatedly; Orchestrator lets us simply schedule the target process. The queue also has a retry mechanism so that it will automatically re-take the input, and we can specify the number of retries
- store a user's ID and password credentials in the Orchestrator database
- check the Orchestrator home page for what processes and jobs are running, and see any feedback on them, as well as the output
- see the logs in Orchestrator.
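The queue usage described above can also be driven programmatically through Orchestrator's REST (OData) API. A minimal sketch in Python follows; the queue name, reference, and content values are hypothetical, and the endpoint shown is Orchestrator's standard AddQueueItem route:

```python
import json

def build_add_queue_item_payload(queue_name, reference, content, priority="Normal"):
    """Body for POST /odata/Queues/UiPathODataSvc.AddQueueItem."""
    return {
        "itemData": {
            "Name": queue_name,          # must match a queue defined in Orchestrator
            "Priority": priority,        # Low / Normal / High
            "Reference": reference,      # unique transaction key, useful with retries
            "SpecificContent": content,  # arbitrary key/value data the robot reads
        }
    }

payload = build_add_queue_item_payload(
    "InvoiceQueue",                      # hypothetical queue name
    "INV-0001",
    {"Amount": 120.50, "Vendor": "Acme"},
)
print(json.dumps(payload, indent=2))

# The request itself would be sent with a bearer token, e.g.:
#   requests.post(f"{orchestrator_url}/odata/Queues/UiPathODataSvc.AddQueueItem",
#                 json=payload, headers={"Authorization": f"Bearer {token}"})
```

Each item added this way becomes a transaction that a scheduled robot picks up, with the retry count applied per transaction as described in the list above.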
I am an API developer and I use UiPath for development. I use it to develop solutions for banking problems, like banking automation.
For example, in my previous company, I used the API to develop automated reporting solutions that take a lot of Excel files, check their data, and generate a web page containing many graphics based on the Excel data. It essentially presents the data on the web, and the report is produced automatically every month.
My current company is a bank, and I'm working on a banking solution: a process for verifying a user's identity or a client's details. The process is based on taking the person's ID card and digitizing the data; it is a technology for reading data from documents. After reading this data, we automatically put it into the database and create accounts for the user, or perform many other automated actions.
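The last step of that flow, writing the digitized ID data into the database, can be sketched as below. The field names and schema are hypothetical, and sqlite3 stands in for the bank's real database:

```python
import sqlite3

# Hypothetical fields produced by the document-reading (OCR) step.
record = {"client_id": "A123456", "name": "Jane Doe", "birth_date": "1990-04-01"}

conn = sqlite3.connect(":memory:")  # stand-in for the production database
conn.execute(
    "CREATE TABLE clients (client_id TEXT PRIMARY KEY, name TEXT, birth_date TEXT)"
)
conn.execute(
    "INSERT INTO clients (client_id, name, birth_date) "
    "VALUES (:client_id, :name, :birth_date)",
    record,
)
conn.commit()

row = conn.execute(
    "SELECT name FROM clients WHERE client_id = ?", ("A123456",)
).fetchone()
print(row[0])  # Jane Doe
```

In the real robot, the insert would be one activity in the workflow, with the extracted fields mapped from the document-reading output rather than hardcoded.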
At my current company, the use case is for the process of managing the relationship between the client account and any fees. A robot always checks if there is something to pay for the client and can take the fee automatically if that is the case. Then there is a transfer of money based on the request.
For example, when someone wants to make a transfer, they add the money and sign a paper. This paper contains the information for the client's account, including details such as the client name, the account number, and the amount of the transfer. The robot automatically takes the data and, via the web, goes to the bank's applications to search for the client, search for the account, enter the amount, and take the proper amount from the account, et cetera. We're able to save steps as everything is automated.
In the case of one of the clients I've worked with, they're working on a process where they need to provide students with a student visa pass. It's within Singapore and every student that has joined this institute needs to apply over a website. 5000 applications are received every year. These applications need to be manually added to the government website.
We automated this process from beginning to end. There's a lot of interaction required. The team worked on an Excel sheet; in fact, a number of people work on these Excel sheets. With many people, there's always a chance of conflicting data: one person might be revising the sheet at the same time someone else is making other revisions, so the data can clash. To make sure this won't happen, we came up with a SharePoint list where we could add the data, and if anyone changes anything, there's a simple and clear record of who made the change and what the change was.
At the same time, the bot can work on the SharePoint list as well - and there is no chance of a clash occurring. We can create a process and a number of steps that involve reading the data and extracting data from an application while swapping or extracting data between two forms.
There's a lot of swapping. We extracted the data via the backend, via the database, and directly put that into the IC application. The processing time for this application previously was somewhere around 20 minutes. Per record now, the time has been reduced to three minutes. Previously, there were 18 people working on any particular application. Right now, there are only two bots working on this website, and they are doing work like magic.
I am the solution architect who handles the setup.
I was working on the 2018 version of UiPath. The 2018 and 2019 versions are very easy and very straightforward. There were not many changes or many complications in order to set up or upgrade. However, when it comes to 2020, from 2020 onwards it's very complicated.
Now there is an IAS, and there is no connection string update: we cannot update any connection strings, which we could do in the 2019 version. From 2020 onwards, we're not able to make those changes at all unless we go further and do another upgrade or something like that.
Earlier it was straightforward; maybe there was a little bit of config, fine. Now, however, it's split into multiple pieces: a config DLL file, an Orchestrator DLL file, an identity server file, and then an appsettings.json file. Everything is spread across all of these, and unless you have end-to-end documentation and understanding, you cannot go ahead and do anything.
On top of that, there is the SSL certificate. Until 2019, we didn't require each and every robot or development machine to have the same SSL certificate. Now, we have to export and import it to all the machines and add it for each user.
From the licensing perspective, licenses used to be straightforward, and no migration was required for a license to be used in any of the versions. From 2020, a license migration is required on the UiPath end. We now need to contact UiPath in order to get this migration done.
All of these changes, as well as the identity server database creation, everything has a kind of impact on the ease of deployment.
Upgrading doesn't take much time, however, users deploying the solution should have a ton of knowledge about each one of the steps. They need to remember everything in order to perform the upgrade or else something might be missed. Even if you miss one step you will have to spend hours and hours in order to rectify that.
For the 2020 version, for the initial deployment, I did not actually do it from scratch. I just upgraded. That said, if a user wanted to do it, I would estimate it takes more than a day to complete.
The implementation strategy depends upon the requirements of the client. For example, if it is on-premises versus if it is on cloud and/or if the client is looking for Elasticsearch or Insights or test automation, et cetera. All of these things will be dependent on the other. If you ask for Insights, you need to have an extra server setup for that. The same thing follows with the test automation and SQL database. What we call roles and responsibilities also will be dependent.
reviewer1642377 says in a UiPath review
Senior RPA Developer at a marketing services firm with 10,001+ employees
The initial setup was straightforward. The installation of the Studio was quite straightforward; we just had to go through all the legal terms and everything, and once we went through those, we just had to install it. The same is true for Orchestrator, as the on-prem installation of Orchestrator is pretty straightforward. You just have to get the setup, link it with the SQL Server, and then install it.
Apart from that, the configuration within Orchestrator was very simple, as there is only one file that allows us to configure everything. It made things pretty obvious.
The deployment took somewhere around two days for the entire setup.
In terms of the implementation strategy, firstly, we decided to set up all the databases and all the dashboard-related services such as Power BI. We decided to do this first due to the fact that the dashboards and databases are the base of any application.
We decided to implement it first in Azure. On the same day, we decided to get the cloud version of the Orchestrator as well. It was quite easy in terms of Azure. There's a three-way plugin that is available there. We just had to install that on the specific VM and we were done. Finally, on the second day, we went ahead and installed all of the Studio. Once Orchestrator is up, we could install Studios and link them to Orchestrator in order to get the license. That was our strategy and our approach.
We essentially have one dedicated resource for maintaining all the deployments and to watch if anything goes wrong. We have three dedicated resources for maintaining all the bots that are currently running as well. We don't need a big team to maintain everything.
reviewer1642950 says in a UiPath review
Application Development Specialist at a tech services company with 10,001+ employees
I would like to see more AI-related features added. Improvements could be made to the models so that they are more compatible with data science and machine learning.
Better support for databases should be included. For example, interacting with SQL Server and SQL Developer would be beneficial features.
The initial setup started off straightforward, however, it can easily get complex. For example, not knowing the other applications and what it takes to interact with them will make things a bit more complicated. Also, choosing between using APIs and actually scripting against the systems, especially when you're dealing with databases or backend reports, et cetera, gets complicated.
Our deployment took three months.
The CTO and development team were involved in the process of implementation.
I'm not sure which version of the solution we're using.
I have not used the AI functionality in my automation program yet.
In terms of functionality, UiPath does quite well. The fact that it's not network-dependent gives it a lot of flexibility to be used in non-standard ways. For example, we were building a robot interfacing with an Oracle database: we could either build a robot to open up an Access database with blank tables and pull the data down, or, because it did not have access to the network, grab an in-memory data set and cycle through that. I liked that flexibility and the fact that we had options.
I'd rate the solution at a nine out of ten.
Our primary use case is document image processing. We're six months in, so our first case was sorting and filtering the data, extracting the image, and determining if it's a certain type of document. If it is, the process starts putting it into different buckets, from which we'll ultimately run something to extract the data and put it into our data source.
Our second use case is for the healthcare industry. We're looking at catalog data and a customer might want to know about a product. Is this product safe? Who provides this product? Is it on a contract somewhere? We go out to multiple different web sources to look up information about that document, put it back in our database, save it for that customer, then save it for any future customer that asks the same question.
We're looking at other things like taking snapshots of the image of the product. We also want to automate other basic automation, low-hanging fruit type functions, like automating uploads of data to sites, spreadsheets, contact-center, and Salesforce.
Longer-term, we want to take what we're doing in the document image and apply it to other areas of our business. We have purchase orders, invoicing, shipping documents, compliance documents, credential documents, a lot of images in this particular space. We'll go as deep as we can in the data processing side of things.
We actually had to spend about four months of maintenance to make sure that we got the solution to how we wanted it. We brought in a contracting firm, but they didn't know the company and just kind of said, "Here's what bots can do."
We ran an assessment program for two months. During those two months, we looked at what they built, which was great. This got us up and running and showed us what's possible.
Then, we took those two months to identify, for example, if the database maybe should have been set up a little better to interact with our other databases. Or if the coding should have had different paths of risk that they didn't know about. If you don't know the business, you don't know the risks, and therefore, you don't know how to set it up. That's why we did all of that assessment and then we spent four months fixing it to adjust to what we thought was a better path or a more stable path in order to support the robots.
Our primary use case is to scrape our database to get data out, create audit files for our tax team, and then take that data and go to websites for each state and submit our taxes online.
reviewer1695615 says in a UiPath review
Associate - Robotic Process Automation at a financial services firm with 10,001+ employees
We primarily use the solution for operations processes in our corporate investment bank. For example, screen scraping, querying from databases, or any transactional processes. Those are what we're really looking at the most.
The database connectors, I have found, are not fully free and expire after 30 days. That is something I would like to see looked at, for MongoDB specifically. In this regard, I was working on a project that needed a robot to read data from a MongoDB database. To achieve this, I used the CDATA ODBC driver, because I couldn't find a direct Activity for it in Studio. See the link to the CDATA ODBC driver for UiPath here https://www.cdata.com/kb/tech/...
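Going through an ODBC driver means the robot talks to MongoDB with an ordinary connection string and SQL. A minimal sketch is below; the driver name follows CDATA's naming convention but is an assumption and should be checked against the installed driver, and the server, port, and database are placeholders:

```python
def mongo_odbc_conn_str(server, port, database):
    """Build an ODBC connection string for a MongoDB driver.
    The driver name here is an assumption; verify it against
    the name registered by the installed CDATA driver."""
    return (
        "DRIVER={CData ODBC Driver for MongoDB};"
        f"Server={server};Port={port};Database={database};"
    )

conn_str = mongo_odbc_conn_str("localhost", 27017, "claims")
print(conn_str)

# With the driver installed, pyodbc can then issue SQL against collections:
#   import pyodbc
#   with pyodbc.connect(conn_str) as conn:
#       rows = conn.cursor().execute("SELECT * FROM policies").fetchall()
```

The same connection string can be used from a Connect activity in Studio, which is what makes the ODBC route viable when no native MongoDB Activity is available.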
It would also be great to have UiPath Insights included in the free Orchestrator. The Insights module is currently only available for paid licenses. It would be great for developers to have it included in the free version because then we could try it out.
UiPath makes it easy to develop automations and this is the main selling point. I can speak with a client and in the meantime, I can prepare a demo on the fly that captures the client's thoughts at the moment. What it means is that as I'm speaking with you, I can start preparing a small demo. I find the product fun to work with.
An example of how this has improved our business is when dealing with internal clients. For example, if an internal business manager wants to use BI and needs to create a report with a specific set of data, they traditionally had to reach out to the IT department. IT will first examine the needs, then discuss how it is developed. It may need a database instance or other tools, for example. Traditionally, this is how it is done.
One of the problems with this approach is that our headquarters is in France, and they are used to having internal discussions about everything. For a use case like this, they will consider all of the needs and other points before making a decision. It can be very time-consuming.
However, if we consider the same use case, using UiPath, we are able to create reports on the fly. We can be right in the same meetings with the IT people when we do it.
If you're from a legal department and your solutions involve HR, as well as other company departments, I can automate several processes in four hours. Then, all of the processes can run during the night. It is an amazing product in this regard.
As we automate processes, another benefit that we receive is the ability to generate internal reports comparing departments and processes. We give these reports to the heads of the company to provide intelligence, helping them to better understand the organization.
As an example of somewhere that UiPath has saved money, I implemented automation to replace a tool that one of our clients has. It is an internal timesheet tool and although the company uses SAP and SAP HANA for these tasks, this tool handles aspects that are specific to Spain. It is a small tool but is needed for a particular purpose.
The initial development of the tool, handled by an external third party, cost €20,000 (approx $22,500 USD) and there is a monthly maintenance fee of €700 (approx $790 USD). We discussed replacing the tool with our client but they were hesitant to change because they already had the solution.
We offered to replace their tool for free because we are trying to internalize processes, so there was also a benefit for us. We explained that once it was completed, we would be responsible for performing the calculations and analysis to ensure that the replacement was working properly. They agreed and it took me only one day to complete the automation. Now, it takes only a single button click from beginning to end. At the end of the day, it brings in all of the jobs. This automation saves them €700 per month in maintenance costs and it would have saved the initial development and deployment fee had it been implemented using UiPath from the beginning.
It was very easy to see that they were wasting money, and this is happening in a lot of places. We proposed to them that for these tasks, we would charge €600 (approx $675 USD) per day as consultants, and then for maintenance, we would bill them a monthly fee equivalent to 16% of the cost of the robot. For the bot used to replace their tool, it took me one day to develop and two days to plan and design it. The initial cost would have been €1,800 (approx $2,000 USD) and the monthly maintenance fee €200. They switched from their tool to the robot, since it was only costing €200 instead of €700 per month.
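Using only the figures stated above (the €600 daily rate, one development day plus two planning days, and the €700 versus €200 monthly fees), the economics of the switch work out as follows:

```python
# Figures as stated in the review, in EUR.
DAILY_RATE = 600
DESIGN_DAYS = 2
DEV_DAYS = 1

initial_cost = DAILY_RATE * (DESIGN_DAYS + DEV_DAYS)  # one-off build cost
old_monthly, new_monthly = 700, 200
monthly_saving = old_monthly - new_monthly
months_to_break_even = initial_cost / monthly_saving

print(f"Initial cost: EUR {initial_cost}")
print(f"Monthly saving: EUR {monthly_saving}")
print(f"Break-even after {months_to_break_even:.1f} months")
```

So even paying the initial €1,800, the client recovers the build cost in under four months, which is why the decision was easy for them.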
After they switched, they realized the power of automation and have since asked us about automating more of their internal processes. They have presented a storm of ideas, and the potential for savings is amazing.
You cannot compare whatever you do with a robot to a traditional software tool, package, or service. This example of the tool that we replaced is only one use case, and there are others but they are all more complex. Overall, it saves a lot in terms of time and cost.
Some of our use cases for UiPath range all the way from development to operational support through to business enablement. Our biggest focus internally is to enable a business to do what they do best. We generally provide solutions through the use of UiPath to cater for streams, e.g., Procure-to-Pay, Hire-to-Retire, and Quote-to-Cash.
We are using it to build solutions that can heal themselves. So, we make sure that our operational team is aware as soon as something fails with the processes that we have built. If one of the use cases or failures has already been listed, we note the fix and try to implement that. If that doesn't work, then we hand it off to a human to look at the task.
In terms of some of the use cases that we have in the business, we do quite a lot of ERP automation. So, we work with SAP quite a lot. We also have a lot of back-end data that we need to bring in and process as well. So, we use our SQL databases to perform tasks, e.g., allocating payments to bank accounts in our ERP system.
Because our development team is rather small, we try to create as many reusable components and solutions on the UiPath platform to make our day-to-day jobs a lot easier.
We are looking at UiPath's AI functionality. It is in the pipeline although there are additional costs for that subscription. Gradually, we have seen that there have been some issues with the automations, miscommunications or misguided invoices, coming into our databases. We need to capture and manipulate that kind of data with the help of UiPath's AI. We are hoping that within two or three months we will be able to introduce that.
reviewer1859118 says in a UiPath review
Senior Software Engineer at a tech vendor with 1,001-5,000 employees
One feature that I personally found valuable was Orchestrator. It is a pretty mature platform now; I first started to use it three to four years back, and it has matured quite well since then. They had a major change a couple of years back, and our company transitioned from the older approach to the newer, modern approach that they deployed. The Orchestrator platform was very well suited to the new approach, as was the development Studio. It's really easy to use and intuitive. These two are what I liked the most about the product.
UiPath's ease of use and quick deployment times were great, as the cloud Orchestrator did not need much of a setup.
To build automation using UiPath is fairly simple. The studio is quite easy to use. Even now, with the community edition, it’s great. If we want to learn to start or try out something, we do not have to wait for licenses or anything else. That said, we can also get an enterprise trial. If we want to do something, learn something, even during our personal time, we can just download it. They also provide a free orchestrator version as well, so it becomes quite easy to learn and develop.
The building, deployment, and manual deployment processes, for small-scale projects, are very easy. If we need to build something, we just publish it, and it generates the NuGet package. It's very easy to deploy there.
The materials and the training courses are all pretty well-structured to get started with.
UiPath Academy courses have assisted in the process of getting our team up to speed. The basics were there even when I started out. I was not initially an RPA developer. I was into server operations before this. The UiPath Academy training really helped a lot with the initial courses, where they give you a tour of the platform and each and every activity. For audiences who are not much into software development, these courses can guide them towards that. The building blocks got us up to speed. They have very good courses there.
Regarding the Academy, it is a great learning platform for basic tasks. However, for more complex information, I turn to UiPath Forum. Sometimes I need some Python or C# scripts or am building custom libraries there. That gets shifted onto different platforms like Stack Overflow. We Google other platforms as well for the other types of queries.
UiPath Forum is a pretty good place in terms of the user community. Most of the queries that are posted generally get answered. Sometimes, even for smaller issues, we do not go directly into UiPath support and we first try to resolve the issues via what we find in the UiPath Community. Overall, it’s a pretty good place to solve our issues, and the community as of now is pretty active.
We have saved time in our IT department since we started to use this solution, because UiPath handles the infrastructure for Orchestrator and its maintenance. A good amount of time is saved, as we initially had an on-prem server deployment as well. It became cumbersome to deploy multiple databases, and there are Elasticsearch requirements and security updates that need to be regularly maintained and kept in sync with UiPath. Due to this infrastructure overhead, our time could have been consumed maintaining everything instead of building and deploying automations. We realized that Automation Cloud would be a better option, which is why we switched.
UiPath reduced human error. That said, we do not track errors in the process. It's a good metric to track as well, however, we currently do not track it.
It reduced employees’ time on certain tasks. The main purpose of automation is to save us the number of hours that the project will take. There are many other parameters, however, the time saved is one of the big ones.
We have plenty of processes that need to be automated, including IT processes like user onboarding. We need to create profiles across all systems for every user that joins the company.
There are also custom processes. For example, there is an insurance process when our company leases a vehicle and also a loan creation process. We need to automate internal hospitalization claims. UiPath handles all of the associated data and document automation. We use UiPath with Aviro for document automation.
We use UiPath's drag-and-drop APIs, but not much because we only have three APIs connected to UiPath. We aren't using AI or machine learning because the AI applications weren't available when we deployed and our current license doesn't include UiPath AIs.
In the future, we plan to use AI. Our current goal is to automate all the old departments. We've done about eight departments, and four more are in progress. After we complete this part, we'll implement the AIs. Until then, we're just working with the existing features.
We use the on-premises version of UiPath, with all the servers running through on-premises platforms. We have separate database and application platforms and a robot that runs through a VM, so we have a separate VM blade for that entire robot. We also have another data warehouse for storing that data.
We currently run on 19.4.4, but we plan to upgrade to 21.10 in the near future. The latest version doesn't support Internet Explorer, but our core banking system only runs on IE platforms. We need to rebuild our operations before switching to the new version. That will take some time because several large processes need to be automated. Until then, we will continue using 19.4.4.
UiPath is deployed on AWS EC2 instances, and we use it for basic automation. Very few developers are working on UiPath here. Our end-users don't deal with this. We only use our database and the data collected from UiPath, but the automation is not visible to our customers. We only use it in our internal development or design phase or to collect some automated data.
We haven't used UiPath's machine learning capabilities, but we are working on that. That could be useful in a few use cases for triggering some queues and running processes in the background. We are using asynchronous jobs, but we don't rely on AI yet.
One of the downsides to UiPath is the cost of the enterprise version. It is a little bit on the higher side.
UiPath's cloud offering is a centralized, all-in-one platform. It saves money because you don't have to invest as much in other software, and it's cheaper than some solutions because you don't have to maintain the platform or the database. However, because it's not cheap, the overall cost reduction is not drastic at first; taking a holistic view, it does reduce costs overall.
There is also a Community Edition that can be used free of charge. This is an option for users that find the price to be high. One main differentiating factor with the Community Edition is the number of updates. There are fewer in the Community Edition.
Also, the support offered for the Community Edition is not as quick. People will not have a great user experience. However, it is important to remember that in terms of cost, the Enterprise Edition is a little bit pricey for small and medium-scale enterprises.
The initial setup was straightforward. We did an on-premises deployment, connecting our SQL database, etc.
The deployment took around three hours.
Nitisha Mungara says in a UiPath review
Business Analyst at a healthcare company with 5,001-10,000 employees
For me, as a business analyst, the biggest difference between Automation Anywhere and UiPath is that UiPath gives me the ability to capture every step. Previously the way I would do it was to sit down with the business user and take notes such as, "Okay, he's clicking into this screen, going to this system," or I would have to record the process. But now it's very simple because I can turn on the recorder and have all the steps documented so I can refer back to them. I can then give that to my developer who can then use it as a benchmark when he's building his process.
The ability to get into the finest of details with UiPath is very helpful. You don't have to worry about those manual methods and can focus on the bigger things. These are some of the advantages of UiPath.
Also, the scalability across the systems that Automation Anywhere is compatible with is limited versus that of UiPath. With UiPath you can connect to different backend sources without restrictions. In the processes that I have worked on with UiPath, there hasn't been a huge complexity if I have to do something with APIs or with databases. Whereas, in Automation Anywhere, it was a little harder. It wasn't very flexible with all the different systems.
I have worked in the Power Platform with Microsoft. That is not specifically for RPA, per se. However, I've worked in it and have familiarity with it.
One of the biggest pros for sticking with Microsoft would be that we're a Microsoft shop, so it would keep all of our solutions in the same realm. We wouldn't have to export and import data across services and databases. It would just all be under the same umbrella. That's a huge benefit of sticking with Microsoft.
With UiPath, there's a stronger infrastructure that supports the development, maintenance, and scale of automation versus Microsoft. However, Microsoft is still pretty new and young to the RPA development environment, and they move very quickly. I would expect that they are not done developing their RPA suite either. I imagine that we'll see future iterations where they become a stronger contender to what UiPath provides.
The most valuable feature is that it can be integrated with almost anything. We use in-house applications and we're able to integrate them through the database using endpoints and APIs. This sharing of information between systems via UiPath means our staff gets results quicker. Normally, we have to put in a request with the database team to go in and input the data. But UiPath already has access and it's really quick. It's really responsive and makes the experience easier for the business.
That is also true for third-party vendors. The vendor that we order our phones from has a UI, a website that we use, and we have been able to use the UI to integrate their application.
We can implement almost any interface that we want, in any way possible. It's really flexible.
And given that healthcare is highly regulated, UiPath works really well for credentialing and that type of security. We see that it has protocols that ensure that our data is not going to be stolen. We use the credential assets to save our passwords and sensitive information such as licensing.
Our biggest use case right now is around business development. We use UiPath to run reports that we get off of the internet against our internal databases to try to find anyone who is a client, because we're a law firm.
Most of my development right now has been on the business development side. We're helping clients with their cases, whether it's a litigation case, a bankruptcy case, or a real estate case. I use UiPath to help identify potential clients or current clients so that we can help them in their legal space.
My understanding is that we are expected to hit the ROI we're projected to hit next year. Even this year, with only three automations running for the entire year, we were essentially able to cover our costs.
There are no more than five automations in full swing, and we would like somewhere between 15 and 20 next year. Likely, the ROI will continue to snowball.
One of the automations that we put live essentially monitors updates to vendor contact information and conversations that are held with the vendors in our agencies and uploads that to a web database. Then, those conversations are logged in an Excel spreadsheet that managers like to keep updated. This saves, on average, about 40 hours a week. Multiply that by 52 work weeks and it's a lot of savings, roughly the cost of two full-time salaries.
reviewer1986756 says in a UiPath review
Director of global process improvement and automation at a manufacturing company with 1,001-5,000 employees
We did AP automation. We started with AP invoice automation first. Then we did a proof of shipment delivery; that was the second concept. After that, we expanded to the inputting of contractor timesheets, and moved on to publishing the metrics on a dashboard and logging in to different databases. Slowly, the demand picked up over the last year.
reviewer1426251 says in a WorkFusion review
Robotics Support at a financial services firm with 10,001+ employees
Throughput monitoring and database bottlenecks are areas that we are working on that need some improvement.
I started with invoice processing. That uses a lot of different parts of the WorkFusion solution. It uses their built-in OCR package, built-in S3 instance, and built-in databases. Invoice processing is the big use case.
We use their AutoML tagging machine learning use cases. Then, at the end, we use their true robotic process automation, using the SeleniumLibrary, to enter things into our ERP system.
I have used it for a couple of other projects. I have done this mostly because WorkFusion makes it easy to code out a straightforward business process, put it on a schedule, and hook it up to a database without needing to requisition all new things and servers to launch it. I can host it in one central spot.
Our particular instance is on-prem. It has five or six different Linux servers that all talk to each other as well as a handful of Windows machines that end up talking to the central Linux boxes. In my time here, we have had two different major versions of WorkFusion: 8.5 and 9.2. We are currently on version 9.2.
reviewer1779444 says in a WorkFusion review
Intelligent Automation Manager at a retailer with 10,001+ employees
The ability to stitch together distinct pieces of a process into a larger workflow is the most valuable feature. I could emulate grabbing input from a screen, from an application, database, or Excel or a CSV file, then take that information and do a ton of different manipulations to it using rules and code, as necessary. Then, I can touch multiple other systems and grab more data or enter data into them, whether it be databases or front-end applications. That flexibility is huge in enabling us to really make a good impact with our automation.
The document processing engine is capable. We have a small number of invoices. For the most part, it works pretty well to process a document, extract the information, and then make updates in our accounting software. Invoices by their nature are structured documents. We only use it for certain vendors, so we are not just throwing everything at it.
It is not bad for structured documents. We just don't use it enough to be able to cheerlead for it. However, where we have used the document processing engine for structured documents, it seems to do a good job.
WorkFusion does a lot of different things. In terms of workspace and document tagging, it is not bad and pretty intuitive. With a little bit of training, people generally pick it up and are off to the races. Other than that, we don't generally let non-technical users touch it.
reviewer1830315 says in a WorkFusion review
Director of Automation at a consultancy with 51-200 employees
If WorkFusion goes down, it is a big problem. If it is not accurate, then we will have issues as well. We need to get the right freight and clear the right freight. The robots have helped increase the rate of efficiency by 75% to 80% from a customer perspective and 30% from an order entry perspective.
We had some issues in the past, but it is fairly stable now. From what I hear, it is more stable in the later versions. With version 10.1.4, there were some issues relating to load distribution. However, it is good now. Their support and infrastructure teams have been able to help fix those issues. We have had some downtimes in the past, but their support team has been fast to help out with them.
The solution requires a DBA to clean up the database, reindex, and update the statistics in the database. It requires maintenance there. There is also the monitoring of the AWS service in general.
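The reindexing and statistics-update work the reviewer mentions can be pictured with a small sketch. The review does not name WorkFusion's backing database engine, so an in-memory SQLite database stands in here, and the table and cleanup rule are invented for illustration:

```python
import sqlite3

# Stand-in database: the review does not name WorkFusion's backing
# engine, so SQLite is used here purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO tasks (status) VALUES (?)",
                 [("done",)] * 500 + [("pending",)] * 5)
conn.execute("CREATE INDEX idx_status ON tasks (status)")

# Routine maintenance of the kind the reviewer describes:
conn.execute("DELETE FROM tasks WHERE status = 'done'")  # clean up old rows
conn.execute("REINDEX idx_status")                       # rebuild the index
conn.execute("ANALYZE")                                  # refresh planner statistics
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM tasks").fetchone()[0]
print(remaining)  # 5 pending rows left after cleanup
```

On a production engine the commands differ (for example, `UPDATE STATISTICS` on SQL Server), but the DBA routine of purge, reindex, and re-analyze is the same shape.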
Kryon RPA: Database
Process Discovery is brand new. We are on 19.1 for it. I know that on 19.2 they changed the architecture completely. From talking to some other companies that use this, it sounds like we are missing some pretty big features that we will need, so we will be doing an upgrade here in the near term. In general, Kryon's upgrade process is basically uninstall/reinstall at this point. They don't have an easy way to upgrade the software in place, which would be an added benefit. The process is not difficult. We have just a handful of robot machines with high availability enabled, along with a couple of app servers and a couple of database servers. Still, that's 12 machines which all need to be upgraded, and that is no simple effort when you're talking about a full software reinstall. If we fully utilized our licensing by scaling out, we'd have more than 30 machines, and we would have to upgrade in the field. It becomes a pretty big task whenever they release new features that we want to take advantage of going forward. I see an opportunity for improvement from them here.
We are using it to do some automated reporting, and right now we can't put images into the HTML-formatted body of an email. We can either attach an image or embed a link to an image, but we can't just drop an image into the HTML. That is feedback that I have given them. It would be nice for a bot to be able to take an image and paste it in as you would in Outlook. Otherwise, we have to host those images on a public website; if we just attach them, it doesn't look as clean. People in our sales force especially, who are on their mobiles a lot, are not on our network a lot. This is a challenge for them when they just want to be able to glance at the report and go on with their day. It seems like a small problem, but it's limiting for us in some of the areas where we could deploy more of this solution. We have a feature request in for this, and I'm hoping it will be included in the future.
How it delimits file names on email attachments could be better. The delimiter is a hard-coded comma, and if somebody includes a comma in a file name, that messes it up. It is such a ridiculous thing. Who puts commas in file names? But you would be surprised; it happens. This is another simple thing they could easily tweak.
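The underlying problem here is the classic one of a delimiter character appearing inside a field. A minimal sketch of why a naive comma split breaks, and how CSV-style quoting avoids it (one possible fix, not necessarily what Kryon would adopt); the attachment names are made up for illustration:

```python
import csv
import io

# A hypothetical attachment list where one file name contains a comma.
# Naive splitting on "," breaks that name in two; CSV-style quoting does not.
raw = '"report, final.pdf",summary.xlsx'

naive = raw.split(",")                       # mangles the first name
parsed = next(csv.reader(io.StringIO(raw)))  # respects the quoting

print(len(naive))   # 3 fragments from 2 file names
print(parsed)       # ['report, final.pdf', 'summary.xlsx']
```

Any quoting or escaping convention would do; the point is simply that a bare `split(",")` cannot distinguish a separator from a comma inside a name.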
Their Tier 1 support is pretty basic. You either have to jump through the same hoops every time or escalate to a different Tier through your rep.
Reviewer46830 says in a Kryon RPA review
Senior Systems Analyst RPA at a hospitality company with 10,001+ employees
We have relied heavily on the database integration features, as well as the email and Exchange integration features. The ability to integrate with Office makes life a lot easier. One of the things that we do is interact with a lot of Excel worksheets and their information without having to load up Excel itself. Instead, we are able to pull the data directly from the worksheets and do the manipulations that we need, then put the results back into the spreadsheet or into another worksheet without having to wait for Excel to load.
The ability to integrate with a database is a big perk from a scalability standpoint. Our automated processes for our rate programs are driven by database entry. We also use databases for our queuing systems and reporting purposes. It is proving to be the backbone of our analytics side of tracking and monitoring our automated processes and systems.
We have been using Kryon Process Discovery for about a year now. With our initial tests and monitoring just a couple people, we were able to identify 79 possible use cases within a one-month period.
For developers, Kryon is quite easy to use. I come from a scripting background. Adapting from scripting over to working in Kryon's development environment is very simple. It was an easy adjustment.
It is easy to move things through testing and into production. From a development standpoint, it is a pretty streamlined, straightforward process.
Fortra Automate: Database
reviewer1652418 says in a Fortra Automate review
Clinical business analyst senior at a healthcare company with 5,001-10,000 employees
We primarily use the solution for uploading and downloading files on vendor SFTP sites, reading data from files and monitoring files to alert and take action, and internally moving files such as data extracts and reports from one destination to another, including into a network folder or server.
We also use it for data transformation and manipulation, such as converting files from text to Excel or CSV, unzipping files, leading rows and columns, reformatting rows and columns, and combining files.
We also use the solution to connect to databases and execute SQL statements to produce reports.
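A report job of this kind boils down to running a query and writing the result set to a file. Here is a hedged sketch of that pattern; the review does not name the database engine or schema, so an in-memory SQLite database with an invented `orders` table stands in:

```python
import csv
import io
import sqlite3

# Illustrative stand-in: the review does not name the database engine,
# so an in-memory SQLite database plays the role of the reporting source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "East", 120.0), (2, "West", 75.5), (3, "East", 30.0)])

# Run a report query and write the result set out as CSV.
cursor = conn.execute(
    "SELECT region, SUM(total) FROM orders GROUP BY region ORDER BY region")
report = io.StringIO()
writer = csv.writer(report)
writer.writerow(["region", "total"])
writer.writerows(cursor.fetchall())

print(report.getvalue())
```

In a tool like Automate, the query, the connection details, and the output destination would be task configuration rather than code, but the data flow is the same.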
We use HelpSystems Automate primarily for file manipulation, such as moving files from folder to folder and tasks that other tools can't do without some programming.
When files come in, we want to send them to specific locations based on their names and based on the results of IF statements. The solution makes this kind of work more straightforward and more drag and drop.
We also use the product to move Excel spreadsheets or PDF files and to convert PDF files; it's a powerful tool for transferring information. It has a playback feature, but we don't use that in our current environment.
We have two servers running the solution; one for development and one for production. Going from development to production is as simple as dragging the object over and putting it into production with little to no change, which makes promotion straightforward.
We get a data file comprised of multiple reports, and it's a text file. We use the solution to split the text file into individual reports, and then we can drop them into a folder, and they get picked up by another tool, OnBase. That is an essential task for us.
We also use the tool for retrieving data files and sending them out via AWS or Google Cloud, as it has cloud-based capabilities.
The tool works behind the scenes; we created a process that reads many data files, matches them to a SQL database and moves them to the correct folder while collecting information required for other processes further down the road. That's mainly what we use it for, and I'm responsible for maintaining it. Automate makes my job easier because I don't have to keep rewriting scripts or changing file names; we can quickly change a database file, and the solution picks that information up and processes it.
For example, we get a file that has ten reports in it. We take that file and run it through Automate, which reads the data file line by line and breaks it up. Line one through 1000 is one report, and 1001 may be a second report. Automate handles that repeatedly very well. It then reads the database to determine what the report is and where it is supposed to go. If a report doesn't match any database parameters, it's simple to go into the report in Automate and add the condition.
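The split-and-route logic described above can be sketched roughly as follows. Everything here is hypothetical: the marker string, report IDs, and folder names are invented, and a plain dictionary stands in for the SQL lookup table the real process reads:

```python
# Hypothetical sketch of the split-and-route logic the reviewer describes.
# ROUTING stands in for the SQL table that maps each report to a destination.
ROUTING = {
    "AR-SUMMARY": "accounting/",
    "INVENTORY": "warehouse/",
}

def split_reports(text, marker="*** REPORT:"):
    """Break one combined text file into (report_id, lines) chunks."""
    reports, current_id, current = [], None, []
    for line in text.splitlines():
        if line.startswith(marker):
            if current_id is not None:
                reports.append((current_id, current))
            current_id, current = line[len(marker):].strip(), []
        else:
            current.append(line)
    if current_id is not None:
        reports.append((current_id, current))
    return reports

def route(reports):
    """Match each report ID against the routing table; flag unmatched ones."""
    placed, unmatched = {}, []
    for report_id, lines in reports:
        folder = ROUTING.get(report_id)
        if folder:
            placed.setdefault(folder, []).append(report_id)
        else:
            unmatched.append(report_id)
    return placed, unmatched

combined = "*** REPORT: AR-SUMMARY\nrow1\nrow2\n*** REPORT: INVENTORY\nrowA\n"
placed, unmatched = route(split_reports(combined))
print(placed)     # {'accounting/': ['AR-SUMMARY'], 'warehouse/': ['INVENTORY']}
print(unmatched)  # []
```

An unmatched report ID would surface in `unmatched`, which mirrors the reviewer's point that adding a new condition to the database is all it takes to pick up a new report type.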
ABBYY Vantage: Database
reviewer1647750 says in an ABBYY Vantage review
IT Consultant at a financial services firm with 10,001+ employees
I specialize in RPA. So, I started learning FlexiCapture to include it in the automation process. Instead of users having to manually enter information into a database or something like that, they can just load their digital documents to a certain location for ABBYY to decipher them, and then the bot and the automation could just continue along with that.
For one of the projects, it was deployed on a public cloud. For the other project, it was on-premise.
What I found most beneficial about VisualCron is that you can directly connect it with different databases, user accounts, and credentials. You can run queries from one machine to another using those credentials, unlike PowerShell, which runs on the machine itself. With VisualCron, you can directly address other machines and run requests through user credentials.
Suppose you have an AWS or Azure environment. In that case, you can maintain the connection between on-premises and cloud-based machines, which is the type of setup in the organization I'm working for. You can directly call some of the services via VisualCron on your local on-premises machine, and I found that an excellent feature of the tool.
Microsoft Power Automate: Database
reviewer1225296 says in a Microsoft Power Automate review
Practice Principal - Cloud and Automation at a tech services company with 51-200 employees
The price depends on the features that we are using.
The licensing cost for us at this time is between $8 and $20 per user, per month.
It's a monthly cost for every user that touches one of the flows or is kicking off a workflow.
Licensing can get expensive.
There are premium connectors, where if you want to connect to external data sources, there is an additional cost for that.
I think one of the big issues was that Azure SQL databases and other SQL databases used to be covered by the standard connectors, and then they converted those to premium connectors, which increases the cost and limits the functionality you get for what you are paying.
reviewer1342437 says in a Microsoft Power Automate review
Director - Analytics and Data at a tech services company with 11-50 employees
I would like to see more integration with the desktop application and the on-premises server, as well as the SQL database. That would be really nice.
It would be great if you could integrate outside of Microsoft environments.
The initial setup can be a bit tricky if it is a complex environment.
The product needs clearer integration between the UI recorder, the steps that you record, and the code and configuration that you do for the RPA.
reviewer1521363 says in a Microsoft Power Automate review
Digital Strategy Manager at an energy/utilities company with 10,001+ employees
Microsoft shouldn't charge extra for the database license if you want to store data in the database during the trial. We wanted to have a historical trend of the data, and we started with the trial version of the tool. The database license is not included with the trial version; you have to purchase it separately. Because we had a budget constraint, we had to pull all the information manually from the system, massage it, and push it to the dashboard. About two months ago, we upgraded to the full-fledged version, in which the database is integrated. The database license should be there in the trial version, but they have totally decoupled it. They should have provided a bundle, at least for the trial version, so that once a person or a firm gets a sense of it, they can start building. It might be because they wanted to sell additional licenses or premium licenses that they have added it only in the premium version.
It should have more cognitive features. Automation Anywhere and UiPath are different because they have cognitive functionality plus intelligent automation. The cognitive functionality is currently not there in Microsoft Power Automate. It is just for workflow automation and basic bot-level tasks. It should have more cognitive features, which probably will be launched in a couple of years.
Workflow management is what clients select the most. It is very intuitive and pretty much drag-and-drop, so we can create escalation, decision flows, and if-else conditions pretty much by dragging and dropping boxes. Even someone who is not technical can develop a workflow for the business.
It is very easy to use. It doesn't need a technical resource to create and maintain forms. The UX/UI is very similar throughout the Microsoft platform, including SharePoint and Office 365.
I would not recommend Power Automate to all organizations, because whether or not it will work out depends on a number of factors. If the organization is all in one Microsoft environment or well-connected with the cloud, it can be really good. But the organization will incur quite some costs if it is in a Google or Amazon environment.
If an organization uses Power Automate, it is best if they use Power Automate's other tools like Power BI and Power Apps. That's why I think the pricing could be lower as you should be able to get several solutions in a bundle. This should be an option during relicensing, for example.
Someone using Power Automate should also have a basic understanding of databases, Power Apps, Power BI, and Azure. You have to know the full environment to get the most out of this tool. In other words, training is mandatory.
Rania Aouad says in a Microsoft Power Automate review
Business consultant, Head of RPA Solutions at Bsynchro Sal
During their merger, some key functions were eliminated, so it no longer runs independently from the cloud as a standalone product for other companies. For example, options to schedule and trigger have been removed. It would be nice to have these capabilities back.
The flow and the connectivity between databases and so on are a bit complex as well.
With the desktop version, we can automate many things. The ability to do a lot of things is one of the most valuable features.
Some of our clients have web-based reporting systems that don't have APIs, and there was no other way to get the data into those systems than to enter it manually. We were able to automate that process entirely. We now collect information in a front-end Power App, store it in a cloud database, and then push it into these external systems.
The fact that we can do that for them is probably the most powerful piece for those people.
We have removed an entire repetitive business process that didn't add any real value to the business but was required for compliance reasons.
I would advise paying critical attention to the environment that you're setting up. User access roles, either through Active Directory or through the database control method, should be the key focus. After that, you need to assign roles and licenses as necessary. From there on, you need to integrate the system.
The Microsoft documentation portal for both cloud and on-premises is going to be the easiest to follow. All the solutions are there. For technical assistance from the Microsoft side, contact details are available on the documentation portal for any type of query.
I would rate it a solid eight out of ten. For me to give it a ten, there should be seamless integration between both the cloud and the on-premises solutions. There should be the exact same or similar functionality between the two to make the entire automation process a bit more streamlined.
Inflectra Rapise: Database
We always use the product for end-to-end automated test cases, never for unit tests. We use it to automate functional testing of end-to-end test cases, evaluating the impact on the databases and avoiding the time we would otherwise spend working through the front end. That is all.
Moreover, I have found the correlation of the values for Object ID to be valuable. These are two of the most important features, and they are very well managed.