Using Views data export, is there a way to have the *.txt file that is provided as a download saved directly into a directory on the web server side?

We'd like a rule so that each time a piece of content is published, Views data export exports some data from that published node into a local (server-side) directory.

And for the moment, choosing 'provide as file' only gives the path to a downloadable file, but it looks like nothing is saved by Drupal.

Thanks for the help.


johnv’s picture

Steven Jones’s picture

Status: Active » Fixed

Sorry, saving the exported file to somewhere permanent isn't a supported feature.

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.

mostou’s picture

I was looking for a similar functionality, since I am not able to use Drush with our current hosting provider. I found another way by using rules and "Execute custom PHP code":

$Output = views_embed_view('ViewName', 'views_data_export_1');
file_unmanaged_save_data($Output, 'sites/default/files/private/output.xls', FILE_EXISTS_REPLACE);

Just to share my experience with others with the same need without the possibility of using Drush.

dbourrion’s picture

Wow, that one is a smart way :)
Love it

johnv’s picture

Title: Saving the txt file » How to programmatically save a views data export file to a repository/a node field/etc.
Component: Code » Documentation
Status: Closed (fixed) » Active

I'd like to re-open this issue, after reading #4.
The goal would be some documentation of how to do this in a few lines of custom code.
As Steven Jones explained in this and other issues, this request is out of scope for the module. However, a cookbook with how-to recipes would help a lot for use cases like the one in #1.
We may end up with some code templates. The lines in #4 could be a start for that.

Steven Jones’s picture

Title: Saving the txt file » How to programmatically save a views data export file to a repository/a node field/etc.
Component: Code » Documentation

Okay okay, here's an idea:

Create a new module and have it provide Rules integration to render a views data export to a file or something; but basically, keep it in a module, just not in this module :)
If you want me to create the module/project and then give you commit access I can.

This would be better and easier than a recipe, I think?

johnv’s picture

OK, but let me work on some working code first. I'm not sure about the architecture of such a module.
Nor about the name: views_data_export_attachments? views_data_export_file?

I'm posting my ideas for some crowdsourcing...

use case 1: save file on server upon cron job using elysia_cron's hook_cronapi().
1. the user creates a .cron.inc file for each view to be exported. It contains:
- a _cronapi() settings function; This allows you to schedule the job using Elysia cron.
- a _cron() function; This calls the view. It looks like below
2. The module contains a .module file, with 1 function:
- a _cronapi() function that reads/includes all '.cron.inc' files and calls each *_cronapi() function. (This is how Views works: scraping all '.views.inc' files in a directory.)
3. a UI for non-programmers?
- A multivalue field, where each record is a set of data from hook_cronapi.
- A multivalue field, where each record is a textarea, where you can type in the array.

use case 2a: save a file on server upon a node save/update/delete
use case 2b: mail a file upon a node save/update/delete

use case 3: save a file on a node's field upon a node save/update/delete
The hook_cronapi settings (mentioned above) can be extended with settings, like the hook you need, etc.
- using rules (new territory for me).
- using hook_entity_. A function fires every VDE that contains the correct hook_name.
Still need to work on:
- how to pass the $node->nid
- how to add the report to the node.
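
As a concrete starting point, use case 1 above could be sketched roughly like this, assuming Elysia cron's hook_cronapi() and reusing the save approach from #4. The module name, view name, schedule, and file path below are all hypothetical placeholders:

```php
/**
 * Implements hook_cronapi() (provided by Elysia cron).
 * Module, view, and path names are placeholders.
 */
function mymodule_cronapi($op, $job = NULL) {
  if ($op == 'list') {
    return array(
      'mymodule_nightly_export' => array(
        'description' => 'Save a views data export file on the server',
        'rule' => '0 2 * * *', // Crontab syntax: every night at 02:00.
        'callback' => 'mymodule_run_export',
      ),
    );
  }
}

/**
 * Cron callback: render the export display and save it, as in #4.
 */
function mymodule_run_export() {
  $output = views_embed_view('my_export_view', 'views_data_export_1');
  file_unmanaged_save_data($output, 'private://export.txt', FILE_EXISTS_REPLACE);
}
```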

Falk_SB’s picture

I tried the solution #4 suggested, but without success…
The Rules Log looks fine, but no file is created.
I don’t get any error messages by the way.

Any help would be appreciated

mostou’s picture

My use case - schedule sending a list (views-data-export) to the board of an organization (via rules scheduler/component).

I ended up using this code instead in a rules component:

$view = views_get_view('view_machine_name');
$display = $view->preview('view_display_name');
file_unmanaged_save_data($display, 'private://display.xls', FILE_EXISTS_REPLACE);

view_machine_name: is the machine name of the view
view_display_name: is the machine name of the display (hover the mouse over the display in order to see the machine name)

file_unmanaged_save_data requires a stream wrapper URI see: https://api.drupal.org/api/drupal/includes%21file.inc/function/file_unma...

views_embed_view() performs an access check on the view, so if you have access restrictions on your view, a cron job will not be able to read from it when using views_embed_view(). The above code does basically the same as views_embed_view(), just without access control and args.

After sending the file attached in an email I am deleting the file again in another php-snippet:
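
The deletion snippet itself wasn't included in the post; a hypothetical one-liner using Drupal 7's file_unmanaged_delete(), assuming the same path as the save call above, would be:

```php
// Hypothetical cleanup: remove the unmanaged export file after mailing it.
file_unmanaged_delete('private://display.xls');
```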


For my use-case I think a module only for this would be a bit overkill since I only need a few lines of php-code. Just my opinion.

TravisJohnston’s picture

I am in dire need of this. I have a course/event registration website with Commerce and I need an end-of-day report of orders/registrations. I have tried with Cron with no luck. And I have an Acquia server that doesn't support some of the standard email attachment scripts.

With the code displayed in #10, you say you did it with a rules component, but what was the condition? And how are you calling it? I've only made rules components for Views Bulk Operations.

TravisJohnston’s picture

Using the information in #4 I was able to manually execute the Rules Component I made with the code you provided. It successfully downloaded the CSV!!! But I need to trigger it nightly. I have setup Rules Schedule to execute the trigger at "tomorrow midnight" so I just need to wait and see if that works and if that timestamp works everyday, not just today.

mostou’s picture

You are on the right track. I used Rules Scheduler as well. Rules Scheduler only triggers on a cron run, so if you have set the component to be executed at midnight and the next cron run is at 2am, the component will be executed at 2am.

You can of course fine-tune your cron settings, for instance with Elysia cron, and call the cron script from outside of Drupal. Drupal cron in D7 triggers at the specified interval when the site is visited (poor man's cron). In my case the site was not visited that often, so I set up a crontab at my hosting provider to call the cron script every hour, and with Elysia cron I specified that Rules Scheduler should run every hour and other system maintenance tasks once per day (at night).
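
For reference, a crontab entry at a hosting provider that hits Drupal 7's cron script every hour might look like this (the host and cron key are placeholders; D7 shows the real cron URL, including its key, on the status report page):

```shell
# Call Drupal 7 cron every hour from the system crontab.
0 * * * * wget -O - -q -t 1 "http://www.example.com/cron.php?cron_key=YOUR_CRON_KEY"
```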

TravisJohnston’s picture

So I set it up to also send me an email notification about the download, with a link to the file, and I received the email around 2:30 this morning! Awesome! Now I just need to see if it runs every night.

The next crucial piece is to set it up to only download if there is something new, if possible. The end result is that the CSV is going to be uploaded into FileMaker for the morning staff to review orders, so it would be useful if it only downloaded when there are new orders from that day. But that's not a big issue.

Thanks for the help! I've been dealing with this issue for months...

johnv’s picture

See this link for a script to download a file, and add it to a node:

joncjordan’s picture

I fine-tuned #4 for my purposes. I needed to save individual xml files for each node to a folder on the server. In my view I added the contextual filter "Content: Nid." Then I created the rule: After saving new content > Content is of type X > Execute custom PHP code. I added the node ID argument via replacement patterns to the code from #4 and voila, I got individual XML files for each node.

$Output = views_embed_view('xml', 'views_data_export_1', [node:nid]);
file_unmanaged_save_data($Output, 'sites/default/files/exports/[node:nid].xml', FILE_EXISTS_REPLACE);

TravisJohnston’s picture

I managed to get this to work only if I manually press the Execute link next to the component. If I have it set to schedule, it does email me every midnight but the CSV only contains the headers, no data.

mostou’s picture

TravisJohnston - please read my post #10 regarding access control in views and the use of views_embed_view in a component executed by cron.

I ended up using the following instead of views_embed_view as mentioned in #10:
$view = views_get_view('view_machine_name');
$display = $view->preview('view_display_name');

If this is still an issue please post your code.

TravisJohnston’s picture

Hello Mostou,

I'm actually using the code you provided. After I click Execute and I get the email, I do see the data. But the cron version I only see the headers. Maybe I just need to check "Disable SQL Rewriting" on the export view?

{ "rules_download_registration_csv" : {
    "LABEL" : "Download Registration CSV",
    "PLUGIN" : "rule",
    "REQUIRES" : [ "php", "rules", "rules_scheduler" ],
    "DO" : [
      { "php_eval" : { "code" : "$view = views_get_view(\u0027registration_csv\u0027);\r\n$display = $view-\u003Epreview(\u0027views_data_export_1\u0027);\r\nfile_unmanaged_save_data($display , \u0027sites\/default\/files\/csv_dumps\/registration\/registration.csv\u0027, FILE_EXISTS_REPLACE);" } },
      { "mail" : {
          "to" : "tjohnston@example.org",
          "subject" : "Registration CSV has been downloaded on [site:current-date]",
          "message" : "Registration CSV has been downloaded at [site:current-date]\r\n\r\nhttp:\/\/www.example.org\/sites\/default\/files\/csv_dumps\/registration\/registration.csv",
          "language" : [ "" ]
        }
      },
      { "schedule" : {
          "component" : "rules_download_registration_csv",
          "date" : "midnight +1 day"
        }
      }
    ]
  }
}

TravisJohnston’s picture

The SQL rewriting had no effect. Since I can get it to work when I do it manually, and I am also logged into the website in the same browser, that may be why I can see the data. Going to try executing in a different browser than the one I view with.


Yup, that reproduced it: I could no longer access the data in the CSV if I am not logged into the website in the same browser. What's kinda weird, though, is that once I log in and click the link in my email, I can download another copy that does show everything. But it's the same file; it's not being regenerated when I log in, since I did not run Execute again.

I would have thought the SQL rewrite would have worked, since I have a bunch of relationships in the view which are permission based, so the SQL rewrite should have ignored that.

TravisJohnston’s picture

Would it be at all possible to set it up so the code runs as a user and, instead of just sending a link to a file on the server, have the file attached to the email? Without mutt or uuencode... I tried, and Acquia servers do not support those.

TravisJohnston’s picture

So I have narrowed down the issue: cron runs as anonymous, so the data in my view is not being captured. I can't allow the distinct SQL (even though it didn't make a difference) because I don't want that view to be accessed by anyone other than particular users.

Is there a way in your PHP to run and send the file as an attachment instead, running it as user 1?

mostou’s picture

When using views_get_view() and $view->preview() it should in fact ignore the access control, so you shouldn't need to run cron as user 1 (which is not possible anyway). In my case I have a restricted view as well, and the component works fine without any issues when executed by cron.

Have you tried to download the CSV file from the server via an FTP client? Is the content of the downloaded CSV file the same? Or do you still just get the headers of the CSV file?

Regarding file_unmanaged_save_data(), the destination should be a stream wrapper URI - see https://api.drupal.org/api/drupal/includes!file.inc/function/file_unmana...
So the destination should begin with either "private://" or "public://". Whether that has an effect, I don't know.

I use the Mime Mail module to send the file as an attachment in the component. Afterwards I delete the file from the server via file_unmanaged_delete().

If you want to troubleshoot whether it's the view: try to create a new simple view (for instance a list of nodes), make it restricted, and use that in your component. What behavior do you see in this case? Could you post an export of the code of that new simple view?

TravisJohnston’s picture

Issue summary: View changes
Status: Active » Needs work

I still can't get this to work unless I am logged in. And I find that it doesn't always download a new file either, sometimes I get the same file sent even though there are a number of new records that are supposed to be in it. For now, I have to be online and in the building when I import this data so I am just downloading the CSV manually.

This seems like a common request though so it would be really beneficial to build the feature into the module.

mostou’s picture

Please elaborate some more on your troubleshooting - otherwise it will be difficult to help you. You haven't answered my previous questions. From your post #20 I am a bit in doubt what your problem is. Try to divide your troubleshooting into two parts:

1. First you should verify that the data are exported correctly from your component via Rules Scheduler (cron). Have you tried to let the cron job execute the component via Rules Scheduler and browse to the folder via an FTP client (for instance FileZilla)? Download the file via the FTP client and open it - is it the same result? Do you see other files in that folder? Try to delete any previously exported files from the folder and let cron execute the component again. Any difference in the result?

2. If the first step is working fine then it's just a matter on how to make the file available for others. If it's located in the private file folder then you should be logged in to download the file. Another option is to attach the file in an email via the Mime Mail module as I have mentioned in a previous post.

3. If you are still stuck: Try to create a new view (a simple view of for instance a list of nodes) and make the view access restricted. Use that in your component - what behaviour do you see in this case? You could also try with a fresh Drupal install and again make a simple access restricted view of nodes and use that in a newly created component.

The code I provided works fine for me - and also others have had success using the code. Basically when troubleshooting and you are stuck remember to keep it simple - try to make some simple variations of your configurations until you see the expected behaviour - in this way you can in most cases find the error or at least narrow it down.

TravisJohnston’s picture

Thanks mostou,

I am not in the office today but on Monday I will setup some different views to see if I can recreate it.

NWOM’s picture

Thank you so much for this. The only thing that I haven't been able to figure out is how to save the file in ANSI/Windows-1252 encoding. Anyone have any ideas? Thanks in advance.

I'm using the example below:

$view = views_get_view('view_machine_name');
$display = $view->preview('view_display_name');
file_unmanaged_save_data($display , 'private://display.xls', FILE_EXISTS_REPLACE);

zotogo999’s picture


Using #4 I am able to create a .csv file on my FTP server. I'd like it to contain some order information and be created each time an order is placed. The view ('Orders') creates the proper .csv file with the data I need; however, when the file is generated by the 'Custom PHP' rule it is always blank. Here's the code I'm using.

$Output = views_embed_view('Orders', 'views_data_export_1');
file_unmanaged_save_data($Output, 'export/export.csv', FILE_EXISTS_RENAME);

Any ideas out there?

mostou’s picture

@ #28

Please read #10. I ended up using a slightly different code, since views_embed_view is checking for access restriction. For a scheduling component this will result in an empty file, if you have access restriction on your view, since cron is executed with anonymous rights. I don't know if this is the case in your scenario.

zotogo999’s picture

Hi mostou... thank you.

Unfortunately, even the code below, entered as 'Custom PHP' in a checkout rule, still generates a .csv file with headers and no data, even though the view shows the data exactly as I want it.

$view = views_get_view('commerce_orders');
$display = $view->preview('views_data_export_1');
file_unmanaged_save_data($display, '/exportdata.csv', FILE_EXISTS_RENAME);

I've deleted the rule and re-entered the custom PHP, to no avail.

Any ideas?

ARUN AK’s picture

I have tried the method mentioned in comment #14 on my Drupal 6 site. It is not working.

Is there an equivalent method available for Drupal 6?

jasonhoward7’s picture

Enabling the "batch export" setting caused my export file to be blank. The file was populated with data by setting this back to "no".

miltonmethod’s picture


Did you ever find a solution for getting a csv of the commerce orders?

paultrotter50’s picture

I had this working perfectly to export a list of user fields. Unfortunately I must have changed something, because now, revisiting this part of my project, I can see that the Excel file is missing data for several users across several fields. The same data is excluded each time the view is saved programmatically. When saving the data manually, by clicking the button at the bottom of the view, all the data is included properly - so it must be an issue with saving programmatically. I can't work out what is wrong.

paultrotter50’s picture

Update to #34: it appears that if the user is a member of an organic group, this stops some of their fields from being exported. The fields that don't export to the Excel file are extra fields that I have added to the user accounts via admin/config/people/accounts/fields, not core user fields. The core user fields like date created and date last logged in are always visible regardless of whether the user has an organic group. I have tried removing the organic group name from the view being exported, but this does not help, and the fields are still missing for users with an organic group.

Does anyone have any ideas about what might be causing this or how to fix it?

Input’s picture

#10 Works like a charm for my .csv data export on D7 and Views Data Export 7.x-3.0-beta7

Countzero’s picture

Did any one of you manage to prevent the save/download dialog from appearing when embedding an export view in code?

Countzero’s picture

For those interested in a brute-force solution, here is what I did. Warning: it's absolutely unsubtle.

In views_data_export_plugin_style_export.inc, in the function add_http_headers(), comment out the following lines:

  /**
   * Add any HTTP headers that this style plugin wants to.
   */
  function add_http_headers() {

/*     drupal_add_http_header('Cache-Control', 'max-age=60, must-revalidate'); */

    if (!empty($this->definition['export headers'])) {
      foreach ($this->definition['export headers'] as $name => $value) {
/*         drupal_add_http_header($name, $value); */
      }
    }

    if (isset($this->options['filename']) && !empty($this->options['provide_file'])) {
      $filename = $this->generate_filename();

      if ($filename) {
/*         drupal_add_http_header('Content-Disposition', 'attachment; filename="'. $filename .'"'); */
      }
    }
  }

Totally blunt, but efficient and quick if you just have to make a one-shot export in a hurry without interrupting execution with save dialogs.

rozsasgergo’s picture

Hi everyone!

Comment #10 works perfectly, if you impersonate the site admin user first.
You can learn about impersonating a user here: https://www.drupal.org/node/218104

In the end the code run by the rule component should look something like this (D7 version):

global $user;
$original_user = $user;
$old_state = drupal_save_session();
// Disable session saving while impersonating, so nothing is written if the code fails.
drupal_save_session(FALSE);
$user = user_load(1);
// Take your action here where you pretend to be the user with UID = 1 (typically the admin user on a site).

$view = views_get_view('view_machine_name');
$display = $view->preview('view_display_name');
file_unmanaged_save_data($display, 'private://display.xls', FILE_EXISTS_REPLACE);

$user = $original_user;
// From here on $user is back to normal, so it's OK for the session to be saved again.
drupal_save_session($old_state);

Cybertrail’s picture


I have a partial solution for saving files on a server from a Commerce order. Turning on 'Disable SQL rewriting' under the view's Advanced, Query settings solved the blank file for me.

The problem is that the latest order isn't added to the file. I think this is because the rule runs before the database has finished updating the order.

Anyone have a suggestion of how to solve this?

Edit: I changed the weight so the other rules that run on completing the checkout process ran first, and that solved the problem.

Mirakolous’s picture

#39 worked for me when I had a small list of products, but my list of just "Shoes" is 7.8 MB and timed out every time. Any suggestions?

aniket.mohite88’s picture

#10 worked for me. Thanks a lot mostou

Since cron runs as anonymous, this may be the proper way to call the views data export output:

  • Use views_get_view()
  • The path should start with "public://" or "private://"
aniket.mohite88’s picture

I tried executing this with a rule, calling the required view (and display) in the rule's custom PHP. This results in nothing; no file is created.

The code worked fine on my local setup (XAMPP) but am not able to reproduce the output on online-server.

I changed my approach.

I added the code to a custom module to create a cron queue.
I am using Ultimate cron and have set this queue task to run every 6 hours.
The view has about 1250 records, which I need in a CSV file.

The view works fine on local setup again but not on the server.

I have tried the code mentioned in #4, #10 & #39.

I have taken note of specifying the path as "public://" as well. This does not seem to be an issue of running out of memory either, because my view now just picks up the required fields from the DB, and no operations are performed on the fields in Views PHP or any other query.

The query works just fine on the local setup. Is there something else that I could be doing wrong? Permissions, PHP memory size, or execution time, in that case?

mostou’s picture

If no file is created, try it without the views part and just with:
file_unmanaged_save_data('any string', 'private://display.xls', FILE_EXISTS_REPLACE);

If the file is still not created, it might be an access-rights issue.

aniket.mohite88’s picture

Thanks a lot mostou for such a quick response. Actually, after a lot of trial and error, I took a look at the "sites/default/files" folder. The permissions were right; it's just that the owner of the folder was not "apache". Not sure whether this was the reason.

About your little piece of code, for testing the permission. Yes it is working just fine. The file is generated with the given code.

Thing is my views is quite resource munching.

  • About 15-16 fields (but none of them have any calculation using views-php or any query alters in custom code.)
  • The computations are done on the node save itself using computed fields
  • The view just calls these (6) computed fields & the rest are normal node fields.
  • About 1250 records are to be retrieved. This might grow (VERY SLOWLY) over time

Now when I execute the code (cron queue), it shows memory-size issues; I have attached an image.
Is it OK to raise the PHP memory limit from 128MB to 256MB?

Also, in the views data export view: with "Batch Export" set to NO, the view run in cron works properly, but not with batch export set to YES (that works when I open the view page link and get a link to download the file, but it does not work in code - the code generates a 0KB file). Is there an issue with batch export set to yes when the code mentioned in #10 or #4 is used?

Again, Thanks for the immediate reply.

pdesai’s picture

I have the same request as aniket. Is there a way to get this to work with Batch Export set to yes? We have an XML view that takes 33 minutes to generate using batch export.

nicholas.alipaz’s picture

I have done some of my own work on this and was able to save the file to the server using the batch export as follows:

In a custom module do:


function dvm_reports_views_data_export_batch_alter(&$batch, &$batch_redirect) {
  $view = views_get_view($batch['view_name']);

  // Add in another operation that will store the file on the server in a
  // location of our choosing.
  $batch['operations'][] = array(
    'dvm_reports_views_data_export_batch_finish', array($batch['eid']),
  );
}

function dvm_reports_views_data_export_batch_finish($eid) {
  $export = views_data_export_get($eid);
  $view = views_data_export_view_retrieve($eid);
  $view->display_handler->batched_execution_state = $export;

  // Get the path to the temporary file.
  $temp_file = $view->display_handler->outputfile_path();
  file_unmanaged_copy($temp_file, 'public://testit2.csv', FILE_EXISTS_REPLACE);
}

You will then see that the results are saved to the testit2.csv file. This can easily be edited for most everyone's needs here. I would assume this would work with the aforementioned rules people are using.

tonystar’s picture

+1 from someone who really needs this feature.

aniket.mohite88’s picture

Hi Guys,

Sorry for the late reply, but my problem was solved by increasing the memory limit on the online server. Ultimate cron really helps to adjust the execution time. Maybe the views data export itself is a big task, coupled with the many fields that I was exporting per row.

Thanks for all the replies. Currently, my solution works fine when the "Batch Export" is set to NO.

Thanks, nicholas.alipaz, for your code as well. I will surely try it sometime later. But yeah, at least people now have a solution for both BATCH EXPORT options.

Thanks again.

tonystar’s picture

Increasing the memory limit is a bad idea. You can't do that every time your export requires more resources.

There is a much better and elegant solution.

You can "run" your existing view via command line using Drush:

$ drush views-data-export VIEW_KEY DISPLAY_KEY OUTPUT_FILE

In this case your resources are not limited by web server restrictions.

For instructions refer to: http://drupal.tonystar.me/drush
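
For example, combined with a crontab entry, the Drush command above could run the export nightly, outside web-server memory and timeout limits. The paths, view name, and display name here are hypothetical:

```shell
# Nightly at 02:00: render the export display straight to a file via Drush.
0 2 * * * /usr/bin/drush --root=/var/www/html views-data-export registration_csv views_data_export_1 /var/www/html/sites/default/files/registration.csv
```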

aniket.mohite88’s picture

Thanks tonystar,

But in my case, just increasing the memory was the feasible solution for us.

Also, this view needed to execute every 12 hours (twice a day), so I created a custom cron job (queue) for this view and adjusted the cron run with Ultimate cron.

Surely, the drush method is noted as well.

maxplus’s picture

@joncjordan from #16:

You pass only one argument (nid) to your view. Any idea whether it would be possible to pass the items selected during a VBO action to the view as multiple arguments?
That would be great.
In the meantime I'm using the solution described in #21 https://www.drupal.org/node/1436726


korlandi’s picture

Hi. I also really need this functionality. It is common for retailers/wholesalers to share inventory lists with other companies. In fact, without this feature, it's very, very difficult for websites that market inventory lists to work with you. I have never added or modified code on my Drupal site before, but here I go...
Thanks for the contributions above. Fingers crossed.

joncjordan’s picture

@maxplus I think you should be able to add more than one argument separated by commas. See comment here: https://groups.drupal.org/node/17397#comment-59414

bwoods’s picture

In my case, I was trying to create a JSON file that's picked up by a feed on another site. #4 sort of worked, but it just printed the output to the screen. In the view, under Format/Settings, I checked the "Views API mode" box to force normal page processing. This created a separate issue, though, because the display wrapper (with divs) is then placed around it. To account for this, I ran a regex in the "Execute custom PHP code" action to get rid of the divs. This does the trick, and the file output is intact.

johnv’s picture

liland’s picture

#39 works great for 1-500 nodes; the file is created on the server. But I need batched export for 500 or more nodes (2000-3000 on my site).

#47 works with a manual start of the batched export: manually press export, and the file is created on the server. But #47 doesn't work with the rules component from #39. How can I combine the functions from #47 and #39 (or something else) to save my file on the server with batched export, run automatically every 12 hours?

When I try to run the component from #39 with batched export (with or without the functions from #47), I get only an empty file.

nicholas.alipaz’s picture

liland, my guess is that Rules is invoking something directly that bypasses the hooks my code hooks into. Perhaps creating a rule that programmatically submits the views data export form in your installation would do it. There are a number of tutorials online for programmatically submitting a form in Drupal. Perhaps give that a try?

mfernea’s picture

I wrote a patch to allow saving files in a permanent location. Please improve it according to your needs/thoughts.

mfernea’s picture

Here is a patch for beta9 version.

anthonys’s picture

I'm having trouble using the code in #4: when I run views_embed_view(), the file is provided as a transfer and Drupal exits, without letting me store the file in a variable. I'd like to save the file to an $output variable without a transfer being triggered. How are you preventing the transfer? It seems to happen whether or not I tick the 'provide as file' setting.

The background is that I have a view set up to provide a .doc, which I would subsequently like to encode as MIME so that I can include images in the start of the word file (see http://sebsauvage.net/wiki/doku.php?id=word_document_generation).

mfernea’s picture

mfernea’s picture

The filesize is not saved correctly in the database; it has the initial filesize, 0. That's a problem for private files. Here are the updated patches.

Status: Needs review » Needs work

The last submitted patch, 63: views_data_export-permanent-files-1875456-63.patch, failed testing.


korlandi’s picture

Thank you so much, @nicholas.alipaz, for the instructions in #47. I have been using this successfully for months on a single batched export view that we use.

However, now I need the same code replicated for 3 more reports I run weekly. They all have 5000+ rows of data, so they must be run via batched export. I have tried manipulating the code on my own, which was fruitless, but I'm not proficient in coding (other than basic Excel and Access macros). I've also tried more than one separate module with the same code, but the end result is a no-go. Can someone please give me some hints on how to modify #47 to automatically save 4 different batched-export views to a permanent folder on my server?


Devaraj johnson’s picture

Status: Needs work » Needs review

Any update on this patch?

nithinkolekar’s picture

Just as in #55, I am using Views JSON with "Views API mode" enabled for programmatically feeding d3js (it's a complex setup, though). But for simple flat data like CSV I couldn't make it work, because drupal_add_http_header() will send the file, breaking further processing.

Code from Views JSON could be borrowed to make this work for XML and RDF as well.