Microsoft's Azure Data Factory is a service built for all data integration needs and skill levels. It is designed to let users construct ETL and ELT processes code-free in an intuitive visual environment, or write their own code. Users can visually integrate data sources using more than 80 natively built, maintenance-free connectors at no added cost, and focus on their data while the serverless integration service does the rest.
Dataloader.io
Score 10.0 out of 10
Dataloader.io delivers a cloud-based solution to import and export information from Salesforce.
$99 per month
Pricing

Editions & Modules
Azure Data Factory: No answers on this topic
Dataloader.io: Professional, $99.00 per month; Enterprise, $299.00 per month
Pricing Offerings (Azure Data Factory / Dataloader.io)
Free Trial: No / No
Free/Freemium Version: No / No
Premium Consulting/Integration Services: No / No
Entry-level Setup Fee: No setup fee / No setup fee
Additional Details: not provided for either product
Features
Scores below are listed as Azure Data Factory / Dataloader.io.
Data Source Connection
Comparison of Data Source Connection features of Azure Data Factory and Dataloader.io

Azure Data Factory: 8.5 (10 Ratings), 3% above category average
Dataloader.io: no ratings

Connect to traditional data sources: 9.0 (10 Ratings) / 0 Ratings
Connect to Big Data and NoSQL: 8.0 (10 Ratings) / 0 Ratings
Data Transformations
Comparison of Data Transformations features of Azure Data Factory and Dataloader.io

Azure Data Factory: 7.8 (10 Ratings), 3% below category average
Dataloader.io: no ratings

Simple transformations: 8.7 (10 Ratings) / 0 Ratings
Complex transformations: 7.0 (10 Ratings) / 0 Ratings
Data Modeling
Comparison of Data Modeling features of Azure Data Factory and Dataloader.io

Azure Data Factory: 6.3 (10 Ratings), 21% below category average
Dataloader.io: no ratings

Data model creation: 4.5 (7 Ratings) / 0 Ratings
Metadata management: 5.5 (8 Ratings) / 0 Ratings
Business rules and workflow: 6.0 (10 Ratings) / 0 Ratings
Collaboration: 7.0 (9 Ratings) / 0 Ratings
Testing and debugging: 6.3 (10 Ratings) / 0 Ratings
Data Governance
Comparison of Data Governance features of Azure Data Factory and Dataloader.io
The best scenario is the ETL process. The flexibility and connectivity are outstanding. For our environment, SAP data connectivity with Azure Data Factory offers very limited features compared to SAP Data Sphere. Due to the limited modelling capacity of the tool, we use Databricks for data modelling and cleaning. The use of multiple tools could have been avoided if ADF had modelling capabilities.
Replacing data. If we've put something in a category or a bucket that is no longer named that because we've evolved with the times and want to rebrand everything, it makes it much easier to do a quick import with the new terms.
Extracting Salesforce attachments in their original file format! I do not know of a tool that can do this better, or more efficiently! This is a huge benefit to companies that would like to extract attachments from Salesforce for tasks like data migrations.
Cross-object data extract within one file. You can pull data from related objects as long as there is a populated lookup from the object you are extracting to another object (child or parent); see the sketch after this list.
The UI is simple and requires little to no training. Given Salesforce's acquisition of MuleSoft, I would not be surprised if Dataloader.io is rolled out as the new global data loading tool for Salesforce.
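Dataloader.io builds these cross-object extracts point-and-click, but to show what the underlying relationship query amounts to, here is a minimal sketch using the simple_salesforce Python library; the credentials and the Contact/Account/Owner fields are illustrative assumptions, not anything the product requires.

```python
# Illustrative only: the equivalent of a cross-object extract, expressed as a
# relationship query via simple_salesforce. Credentials and field names are
# placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder credentials
    password="password",
    security_token="token",
)

# Parent fields are reached through the lookup (Contact -> Account -> Owner),
# so related data from several objects lands in a single extract.
results = sf.query_all(
    "SELECT Id, Name, Account.Name, Account.Owner.Email FROM Contact"
)
for rec in results["records"]:
    account = rec.get("Account") or {}
    owner = account.get("Owner") or {}
    print(rec["Name"], account.get("Name"), owner.get("Email"))
```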
Granularity of Errors: Sometimes, Azure Data Factory provides error messages that are too generic or vague for us, making it challenging to pinpoint the exact cause of a pipeline failure. Enhanced error messages with more actionable details would greatly assist us in debugging our pipelines.
Pipeline Design UI: In my experience, the visual interface for designing pipelines can become cluttered, especially when dealing with complex workflows or numerous activities. I think a more intuitive and scalable design interface would improve usability; features like zoom, better alignment tools, or grouping capabilities could make intricate designs easier to manage.
Native Support: While Azure Data Factory does support incremental data loads, in my experience the setup can be somewhat manual and complex. I think native and more straightforward support for Change Data Capture, especially from popular databases, would simplify the process of capturing and processing only the changed data, making regular data updates more efficient.
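To make that "manual and complex" setup concrete, here is a rough sketch of the watermark pattern that incremental loads in Data Factory are typically built around (usually with Lookup and Copy activities), written out in plain Python with pyodbc; the ODBC data source, watermark table, and column names are hypothetical.

```python
# Rough sketch of a watermark-based incremental load. The DSN, table, and
# column names are hypothetical; ADF implements the same idea with pipeline
# activities rather than Python.
import pyodbc

conn = pyodbc.connect("DSN=source_db")  # hypothetical ODBC data source
cur = conn.cursor()

# 1. Read the watermark persisted by the previous run.
cur.execute("SELECT last_modified FROM etl_watermark WHERE table_name = ?", "Orders")
last_watermark = cur.fetchone()[0]

# 2. Copy only the rows changed since that watermark.
cur.execute("SELECT * FROM Orders WHERE LastModifiedDate > ?", last_watermark)
changed_rows = cur.fetchall()
# ... write changed_rows to the sink here ...

# 3. Advance the watermark so the next run starts where this one ended.
cur.execute(
    "UPDATE etl_watermark SET last_modified ="
    " (SELECT MAX(LastModifiedDate) FROM Orders) WHERE table_name = ?",
    "Orders",
)
conn.commit()
```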
At the moment, I can't find a way to rename jobs. This would be useful for organizing what was previously created hastily by techs in a rush.
A preview of the job, particularly for upserts, would take a great deal of stress away from some of us (especially those who are not so confident in their ETL practice).
A native VLOOKUP equivalent would be a welcome addition.
It is easy to use and doesn't require a security token, so I enjoy using it. It also doesn't require any download or installation, which is sometimes a blocker to getting things done if the company has limits. Also, Dataloader.io is easy for other people to pick up, so others can have visibility into the data jobs that have occurred.
So far the product has performed as expected. We were noticing some performance issues, but they were largely Synapse-related, which has led to a shift from Synapse to Databricks. Overall this has delayed our analytics platform. Once Databricks becomes fully operational, Azure Data Factory will be critical to our environment and future success.
Dataloader definitely skews towards a more technical user base. Users should be adept at manipulating data in spreadsheets and deciphering JSON-formatted error messages. Additionally, a good amount of time is needed to set up the environment to map to the pertinent fields we are trying to adjust. While I would not recommend Dataloader for the typical account manager, a typical operations manager should have no issue.
We have not needed to engage with Microsoft much on Azure Data Factory, but they have been responsive and helpful when needed. That said, we have not had a major emergency or outage requiring their intervention. The score of seven reflects that they have done well so far but have not yet proven their support on a significant issue.
The utility itself is very self-explanatory and has enough information to guide you through the process. It offers an intuitive experience for those familiar with data loading/exporting utilities. Beyond this, they have a Zendesk help center to log support requests and provide documentation to help you troubleshoot any issues that may occur.
Azure Data Factory helps us automate and schedule jobs per customer demand, firing ETL triggers whenever the need arises. Anyone can define a workflow with the Azure Data Factory UI designer tool and easily test the system. It also let us automate the same workflow with programming languages like Python or automation tools like Ansible. Numerous connectivity options, whether to a database or a storage account, help us move data to the cloud or to on-premise systems.
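As a rough illustration of the Python automation mentioned above, the sketch below triggers an existing Data Factory pipeline and polls its run status using the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, pipeline, and parameter names are placeholders, not details from the review.

```python
# Minimal sketch: start an existing Data Factory pipeline from Python and poll
# its run status. Assumes azure-identity and azure-mgmt-datafactory are
# installed; all resource names below are placeholders.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "my-resource-group"   # placeholder
FACTORY_NAME = "my-data-factory"       # placeholder
PIPELINE_NAME = "nightly-etl"          # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a run, passing pipeline parameters if the pipeline defines any.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME,
    parameters={"load_date": "2024-01-01"},  # placeholder parameter
)

# Poll until the run leaves the in-progress states.
status = "Queued"
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status

print(f"Pipeline run finished with status: {status}")
```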
I have also used Salesforce Inspector for operations like importing and exporting data from custom objects, but it doesn't work well when the data volume is huge. Instead of Salesforce Inspector, one should go for Dataloader.io when a huge number of records has to be dealt with.
HUGE time saving. When we need to clean or review data, we used to have to do it line by line. This lets us do the work within Excel, making cleanup/management an afternoon's work as opposed to a week's.
Rolling back what you did, changed, or deleted is relatively simple if you remember to back up the data you are manipulating.
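To make that back-up-before-you-change habit concrete, here is a minimal sketch of an export-then-update flow using the simple_salesforce Python library rather than Dataloader.io itself; the credentials, object, field, and file names are illustrative assumptions.

```python
# Rough sketch of "back up, then bulk-update, so you can roll back".
# Credentials, object, and field names are placeholders.
import csv
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="password", security_token="token")

# 1. Export the current values of the records about to be changed.
records = sf.query_all(
    "SELECT Id, Category__c FROM Account WHERE Category__c = 'Old Name'"
)["records"]
with open("account_backup.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["Id", "Category__c"])
    writer.writeheader()
    for r in records:
        writer.writerow({"Id": r["Id"], "Category__c": r["Category__c"]})

# 2. Apply the bulk change (the "rebrand with new terms" case above).
updates = [{"Id": r["Id"], "Category__c": "New Name"} for r in records]
sf.bulk.Account.update(updates)

# 3. Rolling back is just re-loading account_backup.csv as another bulk update.
```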