Airflow 2.8.3.0 - 2404 (EEP 9.2.2) Release Notes
The following notes relate specifically to the HPE Data Fabric Distribution for Apache
Airflow. You may also be interested in the Apache Airflow home page.
|  |  |
| --- | --- |
| Airflow Version | 2.8.3.0 |
| Release Date | April 2024 |
| HPE Version Interoperability | See Ecosystem Pack Components and OS Support. |
| Source on GitHub | https://github.com/mapr/airflow |
| GitHub Release Tag | 2.8.3.0-eep-922 |
| Maven Artifacts | https://repository.mapr.com/maven/ |
| Package Names | Navigate to http://package.ezmeral.hpe.com/releases/MEP/, and select your EEP (MEP) and OS to view the list of package names. |
| Documentation |  |
New in This Release
- This release updates the Airflow component to version 2.8.3.0.
- Introduced two new options (see the configuration sketch after this list):
  - `admin_only_cli_access`: Limits Airflow CLI access to the admin cluster user only. Set to `true` to restrict the Airflow CLI to the admin cluster user. This property disables impersonation functionality.
  - `admin_cli_with_impersonation`: Limits Airflow CLI access to the admin cluster user, except for `airflow tasks` commands. Supports impersonation when the property is set to `true`. This property has lower priority than `admin_only_cli_access`.
- If the `logrotate` tool is installed on the cluster, Airflow copies its own configuration to the `logrotate` conf files. The webserver and scheduler log files are then rotated daily by default.
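As a sketch, the two new properties would be set in the Airflow configuration along the following lines. The exact configuration file and section placement are not specified in these notes, so treat them as assumptions and consult the HPE Data Fabric documentation:

```
# Sketch of an airflow.cfg excerpt; the section placement is an assumption.

# Restrict the Airflow CLI to the admin cluster user (disables impersonation):
admin_only_cli_access = true

# Alternatively, restrict the CLI but keep impersonation for `airflow tasks`
# commands; this property has lower priority than admin_only_cli_access:
admin_cli_with_impersonation = true
```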
Fixes
- Fix incorrect serialization of FixedTimezone (#38139)
- Fix excessive permission changing for log task handler (#38164)
- Fix task instances list link (#38096)
- Fix a bug where scheduler heartrate parameter was not used (#37992)
- Add padding to prevent grid horizontal scroll overlapping tasks (#37942)
- Fix hash caching in ObjectStoragePath (#37769)
Known Issues and Limitations
- The Installer can install Airflow, but cannot set up MySQL as the backend database for Airflow. The default Airflow database is SQLite. (A sketch of configuring MySQL manually appears after this list.)
- Apache PySpark has many CVEs and has been removed from the default Airflow dependencies. To use the Spark JDBC operator/hook from Apache, install PySpark as follows (the full command sequence is also shown after this list):
  - Activate the Airflow virtual environment: `source <airflow_home>/build/env/bin/activate`.
  - Run `pip install pyspark==3.3.2`.
  - Run `deactivate`.

  NOTE: This process does not affect the Ezmeral Spark provider.
- If the `repair_pip_depends.sh` script fails with the following error, run the script again:

  `subprocess.CalledProcessError: Command 'krb5-config --libs gssapi' returned non-zero exit status 127. [end of output]`
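Because the Installer cannot configure MySQL as the Airflow backend, that setup must be done manually. Below is a minimal sketch using Airflow's standard `[database] sql_alchemy_conn` option; the host, port, database name, and credentials are placeholders, and any HPE-specific setup steps are not covered here:

```
# airflow.cfg sketch; all connection details below are placeholders.
[database]
sql_alchemy_conn = mysql+mysqldb://airflow_user:airflow_pass@mysql-host:3306/airflow_db
```

After updating the configuration, running `airflow db migrate` initializes the Airflow schema in the new database.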
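For convenience, the PySpark installation steps above can be run as the following shell sequence, where `<airflow_home>` is a placeholder for the Airflow installation directory:

```bash
# Activate Airflow's bundled virtual environment (the script must be sourced).
source <airflow_home>/build/env/bin/activate

# Install the PySpark version needed by the Apache Spark JDBC operator/hook.
pip install pyspark==3.3.2

# Leave the virtual environment.
deactivate
```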
Resolved Issues
None.