GoRules is an open-source business rules engine that prioritizes business user experience, performance, and reliability. It enables you to create rules and manage multiple versions of them across multiple workspaces.
GoRules is optimized to provide a common language between IT and business, through:
Decision Graphs - Build visually stunning decision graphs that are easily understood by both business users and developers.
Decision Tables - Simplify business rules management using spreadsheets, with business users taking the lead.
Edge Functions - Add custom business logic to workflows, tailored to your organization's unique requirements.
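To make the decision-table idea concrete, here is a minimal sketch in plain Python of how such a table evaluates an input. The fields, thresholds, and "first hit" policy are invented for illustration and are not GoRules' actual format or API:

```python
# Toy decision table: each row pairs a condition with an output.
# The first matching row wins (a "first hit" policy).

def evaluate(table, facts):
    """Return the output of the first row whose condition matches the facts."""
    for condition, output in table:
        if condition(facts):
            return output
    return None

# Hypothetical discount table; the rules below are made up for illustration.
discount_table = [
    (lambda f: f["country"] == "US" and f["cart_total"] >= 100, {"discount": 10}),
    (lambda f: f["cart_total"] >= 50,                            {"discount": 5}),
    (lambda f: True,                                             {"discount": 0}),  # catch-all row
]

print(evaluate(discount_table, {"country": "US", "cart_total": 120}))  # {'discount': 10}
print(evaluate(discount_table, {"country": "DE", "cart_total": 60}))   # {'discount': 5}
```

The point of the spreadsheet form is that business users edit the rows (conditions and outputs) while the evaluation semantics stay fixed.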
The file-based system is designed to maximize your productivity: build rules with the drag-and-drop rule builder and user-friendly spreadsheets. Organizing and working across multiple teams has never been easier.
The engine's core is written in Rust and is available in multiple languages through bindings. Supported languages include Rust, Node.js, and Python, with more to come.
Scale to over 10,000 requests per second on-premise. Deploy to any of the three major cloud providers (AWS, GCP, and Azure), or choose Enterprise Cloud.
Based on our record, Apache Spark seems to be far more popular than GoRules.io: we know of about 70 links to Apache Spark, but have tracked only 2 mentions of GoRules.io. We track product recommendations and mentions on various public social media platforms and blogs; these mentions can help you identify which product is more popular and what people think of it.
On a serious note: We bought gorules.io domain with initial plans for using GoLang, however after a while, the name stuck with us and our clients, and it felt difficult to go back on something we were used to. We don't associate GoLang with the engine, but we do plan support for it sometime soon (via FFI). Source: about 2 years ago
GoRules is a modern, open-source rules engine designed for high performance and scalability. Our mission is to democratise rules engines and drive early adoption. Rules engines are very useful as they allow business users to easily understand and modify core business logic with little help from developers. You can think of us as a modern, less memory-hungry version of Drools that will be available in many... Source: about 2 years ago
Apache Iceberg defines a table format that separates how data is stored from how data is queried. Any engine that implements the Iceberg integration — Spark, Flink, Trino, DuckDB, Snowflake, RisingWave — can read and/or write Iceberg data directly. - Source: dev.to / about 2 months ago
Apache Spark powers large-scale data analytics and machine learning, but as workloads grow exponentially, traditional static resource allocation leads to 30–50% resource waste due to idle Executors and suboptimal instance selection. - Source: dev.to / about 2 months ago
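The static-allocation waste described above is commonly addressed with Spark's built-in dynamic allocation, which releases idle executors and requests new ones as the workload changes. A minimal `spark-defaults.conf` fragment (the executor bounds and timeout here are illustrative, not recommendations):

```
spark.dynamicAllocation.enabled                  true
spark.dynamicAllocation.shuffleTracking.enabled  true
spark.dynamicAllocation.minExecutors             2
spark.dynamicAllocation.maxExecutors             50
spark.dynamicAllocation.executorIdleTimeout      60s
```

Shuffle tracking (or an external shuffle service) is required so executors can be removed without losing shuffle data.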
One of the key attributes of Apache License 2.0 is its flexible nature. Permitting use in both proprietary and open source environments, it has become the go-to choice for innovative projects ranging from the Apache HTTP Server to large-scale initiatives like Apache Spark and Hadoop. This flexibility is not solely legal; it is also philosophical. The license is designed to encourage transparency and maintain a... - Source: dev.to / 3 months ago
If you're designing an event-based pipeline, you can use a data streaming tool like Kafka to process data as it's collected by the pipeline. For a setup that already has data stored, you can use tools like Apache Spark to batch process and clean it before moving ahead with the pipeline. - Source: dev.to / 4 months ago
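As a sketch of the batch-cleaning step described above, the following uses plain Python rather than Spark (the same logic would be expressed over a DataFrame in PySpark); the record schema and cleaning rules are invented for illustration:

```python
# Batch-clean a collection of already-stored records before the pipeline:
# drop malformed rows, normalize fields, and deduplicate exact repeats.

def clean_batch(records):
    seen = set()
    cleaned = []
    for rec in records:
        # Drop rows missing required fields (hypothetical schema).
        if not rec.get("user_id") or rec.get("amount") is None:
            continue
        key = (rec["user_id"], rec["amount"])
        if key in seen:  # skip exact duplicate rows
            continue
        seen.add(key)
        cleaned.append({
            "user_id": rec["user_id"].strip().lower(),  # normalize identifier
            "amount": float(rec["amount"]),             # coerce to numeric
        })
    return cleaned

raw = [
    {"user_id": " Alice ", "amount": "10.5"},
    {"user_id": " Alice ", "amount": "10.5"},  # duplicate row
    {"user_id": None, "amount": "3"},          # malformed row
]
print(clean_batch(raw))  # [{'user_id': 'alice', 'amount': 10.5}]
```

The value of Spark here is running exactly this kind of per-record logic in parallel across a cluster when the stored data no longer fits on one machine.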
Drools - Drools introduces the Business Logic integration Platform which provides a unified and integrated platform for Rules, Workflow and Event Processing.
Apache Flink - Flink is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations.
DecisionRules.io - Business rule engine that lets you create and deploy business rules, while all your rules run in a secure and scalable cloud. Unlike other rule engines, you can create your first rule in 5 minutes and make 100k decisions in a minute via API.
Hadoop - Open-source software for reliable, scalable, distributed computing
OptaPlanner - Mathematical optimization software
Apache Storm - Apache Storm is a free and open source distributed realtime computation system.