HUE-7248 [adls] fix export to file

Review Request #11704 - Created Oct. 3, 2017 and submitted

Jean-Francois Desjeans Gauthier
hue
HUE-7248
HUE-7248
hue
jgauthier
commit a2f0f2cf08427cebc78dc1f93f0ba68c8d935ae9
Author: jdesjean <jgauthier@cloudera.com>
Date:   Tue Oct 3 17:21:59 2017 -0700

    HUE-7248 [adls] fix export to file

:100644 100644 d23e444c3a... 0a12e35576... M	desktop/libs/azure/src/azure/adls/webhdfs.py
:100644 100644 ad03ee1cbb... d8e77ff841... M	desktop/libs/notebook/src/notebook/connectors/hiveserver2.py


  1. Would you have an example of a URL that we want to fix?

     (to better understand the issue)

    1. This converts a URL with a network name into a local one, e.g. from
       adl://jzhugeadls.azuredatalakestore.net/users/jgauthier/file.csv
       to
       adl:/users/jgauthier/file.csv

       The network name is configured in the ini file and cannot be changed at runtime. Hadoop needs the network name, but since it cannot be changed at runtime, we make it transparent to the user (see the sketch below).
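       A minimal sketch of that normalization, assuming a hypothetical helper name (strip_adl_netloc) and Python's standard urllib.parse; this is not the actual Hue code:

           # Hypothetical sketch of the normalization described above; the helper
           # name strip_adl_netloc is an assumption, not the Hue implementation.
           from urllib.parse import urlparse

           def strip_adl_netloc(url):
               """Turn adl://<account>.azuredatalakestore.net/<path> into adl:/<path>."""
               parsed = urlparse(url)
               if parsed.scheme == 'adl' and parsed.netloc:
                   # ParseResult is a namedtuple, so _replace returns a copy
                   # with the selected fields changed.
                   parsed = parsed._replace(netloc='')
               return parsed.geturl()

           print(strip_adl_netloc('adl://jzhugeadls.azuredatalakestore.net/users/jgauthier/file.csv'))
           # adl:/users/jgauthier/file.csv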

  2. Should we do it only in the case of ADLS?

     Is _replace a standard Python API?

     Add new tests for these formats in https://github.com/cloudera/hue/blob/master/desktop/libs/notebook/src/notebook/tests.py#L358? (A test sketch follows below.)
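     A minimal sketch of what such a test could look like, reusing the hypothetical strip_adl_netloc helper from the earlier sketch; the layout and the nose assert_equal import are assumptions about Hue's test conventions, not the contents of tests.py:

         # Hypothetical test sketch; names and layout are assumptions, not the
         # actual desktop/libs/notebook/src/notebook/tests.py. Reuses the
         # strip_adl_netloc helper sketched in the earlier comment.
         from nose.tools import assert_equal

         def test_strip_adl_netloc():
             # An ADLS URL with a network name is reduced to a local adl:/ path.
             assert_equal(
                 'adl:/users/jgauthier/file.csv',
                 strip_adl_netloc('adl://jzhugeadls.azuredatalakestore.net/users/jgauthier/file.csv'))
             # A path that already has no network name is left untouched.
             assert_equal(
                 'adl:/users/jgauthier/file.csv',
                 strip_adl_netloc('adl:/users/jgauthier/file.csv'))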

  3. No need to pass the request? self.request.

     https://github.com/cloudera/hue/blob/master/desktop/libs/notebook/src/notebook/connectors/base.py#L297

  4. Won't it be an issue without it? (since we go through _data_as_hdfs_file and not export_large_data_to_hdfs)

    1. _data_as_hdfs_file goes through WebHDFS, which does not need the network name in the path (the HTTP URL already contains it). export_large_data_to_hdfs goes through a Hive query and does need the network name. See the sketch below.

    2. ping
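       To illustrate the distinction in that reply, a sketch of the two kinds of reference; both strings are assumptions based on the comment above (an ADLS account named jzhugeadls and a sample destination), not output from Hue:

           # Illustrative only; both strings are assumptions based on the reply
           # above, not the actual requests or statements that Hue generates.

           # _data_as_hdfs_file writes over the WebHDFS-compatible REST API: the
           # account (network name) lives in the HTTP URL, so the filesystem path
           # itself can stay local.
           webhdfs_url = ('https://jzhugeadls.azuredatalakestore.net'
                          '/webhdfs/v1/users/jgauthier/file.csv?op=CREATE')

           # export_large_data_to_hdfs runs a Hive query, and Hadoop resolves the
           # destination from the URI alone, so the network name has to be part
           # of the path.
           hive_statement = (
               "INSERT OVERWRITE DIRECTORY "
               "'adl://jzhugeadls.azuredatalakestore.net/users/jgauthier' "
               "SELECT * FROM sample_07")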

  5. Nice one!

Review request changed

Status: Closed (submitted)
