Record data stream from curl to text file


I'm trying to record some data that streams in from the command:


That curl command listens to the Particle cloud and prints to the terminal any messages sent or received between my Particle devices. I want to save that output to a text file.

I've tried:

  • curl url > output.txt
  • curl url >> output.txt
  • curl url -o output.txt

None of these has worked, on macOS or Linux. The file gets created (or truncated, if applicable), but nothing is ever written to it, even while a duplicate terminal window is printing output from the same curl command.

My only guess is that because the curl command runs indefinitely until I Ctrl+C out of it, quitting that way somehow prevents output from being recorded to the file. How can I record the output to a file as the data comes in?

I've been working with cURL for about a decade, and I don't know of any way to get it to stream data like that from the command line. There is a CURLOPT_WRITEFUNCTION option you can use in the C API, but I doubt you want to write your own C module. I scanned the man page and couldn't find a similar option for the CLI, so I don't think the curl command line offers what you need. As you suspect, it waits until the entire response is received before producing output (which is often necessary when pages are compressed).


I came across this question also hoping to log a Particle event stream from curl. Since curl couldn't do what I needed, I wrote a quick Node script that does. Putting it here in case it's helpful for anyone else. Note that the event name passed to addEventListener needs to match the event string your Particle device is emitting (in this case 'currentTime'):

#!/usr/bin/env node

var EventSource = require('eventsource');
var fs = require('fs');

var URL = '';

var es = new EventSource(URL);
es.addEventListener('currentTime', function (e) {
  // Append each event's payload to the log file as it arrives
  fs.appendFile('eventlog.txt', e.data + '\n', function (err) {
    if (err) throw err;
  });
});


From ...

-N, --no-buffer

Disables the buffering of the output stream. In normal work situations, curl will use a standard buffered output stream that will have the effect that it will output the data in chunks, not necessarily exactly when the data arrives. Using this option will disable that buffering.

I use this option to process data while it arrives, so I don't have to wait until the request completes.
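For example, here is a minimal sketch of the append pattern. The api.particle.io host is taken from the endpoint mentioned in the comments and ACCESS_TOKEN is a placeholder; the local stream function merely stands in for the network stream so the pattern can be tried offline:

```shell
# The real invocation would look like (ACCESS_TOKEN is a placeholder):
#   curl -sN "https://api.particle.io/v1/devices/events?access_token=ACCESS_TOKEN" | tee -a output.txt
# -N disables curl's output buffering, so each event reaches the pipe as
# soon as it arrives; tee -a appends to output.txt while still echoing
# the stream to the terminal.

# Local stand-in for the event stream, so the pattern can be tried offline:
stream() {
  printf 'event: currentTime\n'
  printf 'data: {"data":"hello"}\n\n'
}

stream | tee -a output.txt
```

A plain "curl -N url >> output.txt" also works once buffering is disabled; tee is only needed if you still want to watch the stream in the terminal at the same time.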


  • I was going to recommend telnet to port 80 and then GET /v1/devices/events?access_token=blahblahtoken with a Host: header, but then I noticed it's HTTPS, so that is not going to work.
  • Curl is new to me, but I've used C enough to be comfortable writing my own module if it gets the job done. I'll give it a go when I get home. Are there any particular examples you'd recommend for getting familiar with CURLOPT_WRITEFUNCTION?