facebookarchive / offline-conversion-file-uploader

A tool to upload offline conversions (in a CSV file) to Facebook.

offline-conversion-file-uploader's Introduction

Offline Conversion Automated Uploader

Offline Conversion Automated Uploader is a command-line tool that helps Facebook advertisers and marketing partners upload offline transactions and Custom Audiences to the Facebook Marketing API without building their own API integration.

Why use this tool?

  • Building an API integration requires engineering resources. Typically, an engineer with no prior experience with the Facebook Marketing API needs about 3 weeks for development and testing.
  • To achieve the best possible match between your customers and Facebook users, the data needs to be normalized and hashed correctly. This tool uses the libraries written by Facebook to ensure the best possible match rate.
  • For any issues with this tool, you will get support from Facebook.
  • This tool is updated periodically to support more features.

How to use

  • Offline Events: please see this guide for upload instructions (a sample command is shown below).
  • Custom Audiences: please see this guide for upload instructions.
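
For orientation, a typical upload invocation looks like this (drawn from the commands quoted in the issues below; the file names and environment variables are placeholders, not values from the guides):

node lib/cli.js upload --configFilePath config.json --inputFilePath events.csv --accessToken $access_token --dataSetID $data_set_id --apiVersion $api_version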

FAQ

Q: There's another tool called MDFU. Which one should I use?

We previously built a tool named MDFU with similar functionality. Use this tool whenever possible; MDFU will be deprecated soon. This tool provides the following additional functionality:

  • It generates a report to help troubleshoot issues with your data file.
  • It is more robust, because it uses the battle-proven uploader core that powers the web version.
  • It supports separating preprocessing and upload into two steps, so you can verify that data is hashed properly before sending it to Facebook.
  • It supports a validate command that does a dry run on sample rows of your data file, so you have a chance to fix issues before sending them to Facebook. (Offline events upload only.)
  • It supports resuming, so you can upload the same file again without creating duplicates. (Offline events upload only.)

Q: My company has a firewall that blocks API calls to Facebook. What are my options?

  • Whitelist Facebook IPs: contact your security team to whitelist the IP addresses returned by this command:

    whois -h whois.radb.net -- '-i origin AS32934' | grep ^route
    

    For more information, please refer to this guide, which explains whitelisting for Facebook crawlers; the same set of IPs is used for the API servers.

  • Ask your security team to create a DMZ where outbound HTTP requests are allowed. (A quick connectivity check is shown below.)
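
To verify whether outbound calls reach the API at all (assuming curl is available on the machine; the version path is only an example), request the Graph API host and check that an HTTP status comes back:

    curl -s -o /dev/null -w '%{http_code}\n' https://graph.facebook.com/v8.0/

Any HTTP status code (even an error such as 400) means the network path is open; a timeout suggests the firewall is still blocking the call.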

Q: How does this tool work?

This is a Node.js application that goes through the following steps to upload your offline conversions to Facebook's Marketing API (a minimal sketch follows the list):

  1. Read configurations.
  2. Read the input file as a stream.
  3. For each line read, normalize and hash the columns for upload.
  4. Collect the normalized and hashed data into batches.
  5. POST each batch to the API endpoint.
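
Below is a minimal sketch of that pipeline in Node.js. It is illustrative only: the column layout, the normalization rule (trim and lowercase before SHA-256), the batching, and the endpoint path are assumptions made for the example, following the general shape of the Offline Conversions API rather than this repository's actual code.

const crypto = require('crypto');
const fs = require('fs');
const https = require('https');
const readline = require('readline');

const BATCH_SIZE = 2000; // mirrors the tool's default batchSize option

// Step 3 helper: match keys are normalized and SHA-256 hashed before
// upload. The real tool applies per-field rules; trim + lowercase is
// the illustrative common case (e.g. for email).
function normalizeAndHash(value) {
  return crypto.createHash('sha256').update(value.trim().toLowerCase()).digest('hex');
}

// Step 5: POST one batch of events to the Graph API endpoint.
function postBatch(batch, {accessToken, dataSetID, apiVersion}) {
  const body = JSON.stringify({upload_tag: 'sketch-upload', data: batch});
  const req = https.request({
    method: 'POST',
    hostname: 'graph.facebook.com',
    path: `/${apiVersion}/${dataSetID}/events?access_token=${accessToken}`,
    headers: {'Content-Type': 'application/json'},
  }, res => res.resume()); // a real uploader would inspect the response
  req.end(body);
}

async function upload(inputFilePath, options) {
  // Steps 1-2: configuration arrives as arguments here; the input file
  // is read as a stream, one line at a time.
  const lines = readline.createInterface({input: fs.createReadStream(inputFilePath)});
  let batch = [];
  for await (const line of lines) {
    // Step 3: assumed column order: email,phone,event_time,event_name,value,currency.
    const [email, phone, eventTime, eventName, value, currency] = line.split(',');
    batch.push({
      match_keys: {email: normalizeAndHash(email), phone: normalizeAndHash(phone)},
      event_time: Number(eventTime),
      event_name: eventName,
      value: Number(value),
      currency,
    });
    // Step 4: collect normalized, hashed rows into fixed-size batches.
    if (batch.length >= BATCH_SIZE) {
      postBatch(batch, options);
      batch = [];
    }
  }
  if (batch.length > 0) postBatch(batch, options); // flush the final partial batch
}

The real tool layers reporting, retries, resuming, and progress logging on top of this skeleton.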

License

Facebook Offline Conversion Automated Uploader is BSD-licensed. We also provide an additional patent grant.

offline-conversion-file-uploader's People

Contributors

carlosmartinezt, facebook-github-bot, kefei-cs19, lucianobianchi, robchummel, supasate


offline-conversion-file-uploader's Issues

Getting Error while running node lib/cli.js upload command

While running the command with a dirty file, I get the issue below.

Command Ran : node lib/cli.js upload --configFilePath offline-events/demo-offline-events/config-test-upload-dirty.json --inputFilePath dirtyfiles/abc.csv --accessToken $access_token --apiVersion $api_version --dataSetID $data_set_id

Error:
/lib/cli-upload.js:39
async function main() {
^^^^^^^^

SyntaxError: Unexpected token function
at exports.runInThisContext (vm.js:53:16)
at Module._compile (module.js:373:25)
at Object.Module._extensions..js (module.js:416:10)
at Module.load (module.js:343:32)
at Function.Module._load (module.js:300:12)
at Function.Module.runMain (module.js:441:10)
at startup (node.js:140:18)
at node.js:1043:3
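
Side note, offered as an assumption rather than a confirmed diagnosis: "SyntaxError: Unexpected token function" on an async function declaration is characteristic of Node.js versions older than 7.6, which do not support async/await. Checking the runtime version is a reasonable first step:

node --version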

uploading hashed audiences

Hi,

Is it possible to use your tool to upload hashed audiences? We cannot pull the original data from our database; we only have hashed data.

Where in the script can I disable the automatic hashing, or is hashed input detected automatically? (I tried uploading a hashed audience, but I don't have a way to verify whether it is correct.)

You would be of great help, thanks!
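
For reference, a minimal sketch of the kind of normalize-and-hash step the tool applies to plain-text match keys; the normalization rule shown (trim and lowercase before SHA-256, as for email) follows Facebook's published matching guidance rather than this repository's code. Comparing its output with your stored hashes is one way to check whether they line up:

const crypto = require('crypto');

// Match keys are SHA-256 hex digests of normalized values; for email
// that means trimming whitespace and lowercasing before hashing.
function hashEmail(email) {
  return crypto.createHash('sha256').update(email.trim().toLowerCase()).digest('hex');
}

// Prints a 64-character hex digest; compare it against your stored hash.
console.log(hashEmail('Person@Example.com'));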

API Failing as of July 1 2020

No changes on our end; however, as of July 1 we receive the output below (OAuthException).

It seems to create an infinite loop, returning the same error, and the progress percentage does not increment.

[screenshot of the OAuthException output]

invalid event_time format

Hi !

I am having trouble with the event_time field; all values are rejected as "invalid" by Facebook, although I use the same values as the sample files (https://www.facebook.com/business/help/606443329504150?id=2469097953376494).

For example, for the following input file:

email,phone,fn,ln,event_name,event_time,value,currency
[email protected],+598 12345678,NAME,SURNAME,Purchase,2019-03-26T18:28:00Z,208.44,USD

I get the following report:

================================================================================
Invalid Samples

Row 2
Rejected due to invalid or missing: event_time

      Column 0 (match_keys.email): [email protected]
      Column 1 (match_keys.phone): +598 12345678
      Column 2 (match_keys.fn): NAME
      Column 3 (match_keys.ln): SURNAME
      Column 4 (event_name): Purchase

[INVALID] Column 5 (event_time): 2019-03-26T18:28:00Z
      Column 6 (value): 208.44
      Column 7 (currency): USD

What may I be doing wrong?

Thanks in advance,
Antonio
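
One thing worth checking, offered as an assumption: the demo config quoted in a later issue below maps event_time with "timeFormat": "unix_time". If the data set's format option expects unix time, an ISO 8601 string like the one above would be rejected. Converting the timestamps before upload is a quick test:

// Convert an ISO 8601 timestamp to epoch seconds ("unix_time" in the
// demo config); Date.parse returns milliseconds since the epoch.
const isoToUnix = (iso) => Math.floor(Date.parse(iso) / 1000);

console.log(isoToUnix('2019-03-26T18:28:00Z')); // 1553624880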

Upload / Validate issue

Hope this is readable... I'm unsure how to interpret this - I'm not a JS developer, and the validation output is failing as well. Any help appreciated.

[screenshot of the error output]

Cannot upload a customer list larger than ~13000 rows

When I upload a customer list larger than ~13,000 rows, the app gets stuck at ~90% and keeps showing the same number even after 15 minutes. Moreover, no errors show up in the log files, even with logging = debug.

My original customer list is about 190,000 rows. I kept reducing the file size from the original until it passed, at about 12,000 rows (I don't know the exact number, btw).

The following is the output from the CLI command when uploading 190,000 customers.

Using option accessToken:  {{ACCESS_TOKEN}}
Using option batchSize:  2000
Using option customAudienceID:  {{CUATOM_AUDIENCE_ID}}
Using option customerFileSource:  USER_PROVIDED_ONLY
Using option delimiter:  ,
Using option format:  {}
Using option header:  true
Using option ignoreSampleErrors:  true
Using option inputFilePath:  customerlist.csv
Using option logging:  debug
Using option mapping:  {
  "0": "match_keys.email",
  "1": "match_keys.phone",
  "2": "match_keys.fn",
  "4": "match_keys.ln"
}
Using option removeUsers:  false
Using option reportOutputPath:  report.txt
Using option retentionDays:  0
Using option apiVersion:  v8.0
2020-10-18T20:12:17.230Z INFO Config and logger initialized.
2020-10-18T20:12:17.933Z DEBUG Batch [20, 59385): sending
2020-10-18T20:12:18.193Z DEBUG Batch [59385, 117671): sending
2020-10-18T20:12:18.350Z DEBUG Batch [117671, 175429): sending
2020-10-18T20:12:18.493Z DEBUG Batch [175429, 233365): sending
2020-10-18T20:12:18.635Z DEBUG Batch [233365, 290868): sending
2020-10-18T20:12:18.758Z DEBUG Batch [290868, 347394): sending
2020-10-18T20:12:18.914Z DEBUG Batch [347394, 403856): sending
2020-10-18T20:12:19.054Z DEBUG Batch [403856, 460465): sending
2020-10-18T20:12:19.191Z DEBUG Batch [460465, 517222): sending
2020-10-18T20:12:19.320Z DEBUG Batch [517222, 573329): sending
2020-10-18T20:12:19.438Z DEBUG Batch [573329, 629909): sending
2020-10-18T20:12:19.568Z DEBUG Batch [629909, 686616): sending
2020-10-18T20:12:21.131Z DEBUG Batch [59385, 117671): sent
2020-10-18T20:12:21.147Z DEBUG Batch [686616, 743759): sending
2020-10-18T20:12:21.148Z DEBUG Batch [117671, 175429): sent
2020-10-18T20:12:21.159Z DEBUG Batch [743759, 801513): sending
2020-10-18T20:12:21.160Z DEBUG Batch [20, 59385): sent
2020-10-18T20:12:21.172Z DEBUG Batch [801513, 858830): sending
2020-10-18T20:12:21.272Z DEBUG Batch [290868, 347394): sent
2020-10-18T20:12:21.284Z DEBUG Batch [858830, 916755): sending
2020-10-18T20:12:21.378Z DEBUG Batch [347394, 403856): sent
2020-10-18T20:12:21.390Z DEBUG Batch [916755, 974998): sending
2020-10-18T20:12:21.578Z DEBUG Batch [460465, 517222): sent
2020-10-18T20:12:21.592Z DEBUG Batch [974998, 1032452): sending
2020-10-18T20:12:21.957Z DEBUG Batch [517222, 573329): sent
2020-10-18T20:12:21.969Z DEBUG Batch [1032452, 1090330): sending
2020-10-18T20:12:22.066Z DEBUG Batch [573329, 629909): sent
2020-10-18T20:12:22.077Z DEBUG Batch [1090330, 1148892): sending
2020-10-18T20:12:22.180Z DEBUG Batch [175429, 233365): sent
2020-10-18T20:12:22.192Z DEBUG Batch [1148892, 1207483): sending
2020-10-18T20:12:22.193Z DEBUG Batch [403856, 460465): sent
2020-10-18T20:12:22.207Z DEBUG Batch [1207483, 1265952): sending
2020-10-18T20:12:22.316Z DEBUG Batch [233365, 290868): sent
2020-10-18T20:12:22.329Z DEBUG Batch [1265952, 1323909): sending
2020-10-18T20:12:22.435Z DEBUG Batch [629909, 686616): sent
2020-10-18T20:12:22.448Z DEBUG Batch [1323909, 1381104): sending
2020-10-18T20:12:22.753Z VERBOSE Progress: 12.40%
2020-10-18T20:12:23.490Z DEBUG Batch [858830, 916755): sent
2020-10-18T20:12:23.503Z DEBUG Batch [1381104, 1438825): sending
2020-10-18T20:12:23.613Z DEBUG Batch [801513, 858830): sent
2020-10-18T20:12:23.626Z DEBUG Batch [1438825, 1497712): sending
2020-10-18T20:12:23.739Z DEBUG Batch [686616, 743759): sent
2020-10-18T20:12:23.756Z DEBUG Batch [1497712, 1556550): sending
2020-10-18T20:12:24.283Z DEBUG Batch [1090330, 1148892): sent
2020-10-18T20:12:24.298Z DEBUG Batch [1556550, 1615469): sending
2020-10-18T20:12:24.409Z DEBUG Batch [743759, 801513): sent
2020-10-18T20:12:24.423Z DEBUG Batch [1615469, 1673870): sending
2020-10-18T20:12:24.425Z DEBUG Batch [1032452, 1090330): sent
2020-10-18T20:12:24.439Z DEBUG Batch [1673870, 1732482): sending
2020-10-18T20:12:24.676Z DEBUG Batch [916755, 974998): sent
2020-10-18T20:12:24.697Z DEBUG Batch [1732482, 1791526): sending
2020-10-18T20:12:24.697Z DEBUG Batch [1265952, 1323909): sent
2020-10-18T20:12:24.711Z DEBUG Batch [1791526, 1849786): sending
2020-10-18T20:12:24.824Z DEBUG Batch [1323909, 1381104): sent
2020-10-18T20:12:24.837Z DEBUG Batch [1849786, 1908746): sending
2020-10-18T20:12:24.838Z DEBUG Batch [974998, 1032452): sent
2020-10-18T20:12:24.854Z DEBUG Batch [1908746, 1967565): sending
2020-10-18T20:12:25.378Z DEBUG Batch [1207483, 1265952): sent
2020-10-18T20:12:25.391Z DEBUG Batch [1967565, 2026428): sending
2020-10-18T20:12:25.503Z DEBUG Batch [1148892, 1207483): sent
2020-10-18T20:12:25.518Z DEBUG Batch [2026428, 2085191): sending
2020-10-18T20:12:25.925Z DEBUG Batch [1438825, 1497712): sent
2020-10-18T20:12:25.938Z DEBUG Batch [2085191, 2144240): sending
2020-10-18T20:12:26.047Z DEBUG Batch [1381104, 1438825): sent
2020-10-18T20:12:26.059Z DEBUG Batch [2144240, 2202836): sending
2020-10-18T20:12:26.167Z DEBUG Batch [1497712, 1556550): sent
2020-10-18T20:12:26.180Z DEBUG Batch [2202836, 2261166): sending
2020-10-18T20:12:26.800Z DEBUG Batch [1556550, 1615469): sent
2020-10-18T20:12:26.812Z DEBUG Batch [2261166, 2319887): sending
2020-10-18T20:12:26.814Z DEBUG Batch [1615469, 1673870): sent
2020-10-18T20:12:26.827Z DEBUG Batch [2319887, 2378287): sending
2020-10-18T20:12:27.038Z DEBUG Batch [1732482, 1791526): sent
2020-10-18T20:12:27.052Z DEBUG Batch [2378287, 2437167): sending
2020-10-18T20:12:27.158Z DEBUG Batch [1791526, 1849786): sent
2020-10-18T20:12:27.170Z DEBUG Batch [2437167, 2495895): sending
2020-10-18T20:12:27.279Z DEBUG Batch [1908746, 1967565): sent
2020-10-18T20:12:27.291Z DEBUG Batch [2495895, 2553746): sending
2020-10-18T20:12:27.490Z DEBUG Batch [1673870, 1732482): sent
2020-10-18T20:12:27.503Z DEBUG Batch [2553746, 2612066): sending
2020-10-18T20:12:27.615Z DEBUG Batch [1849786, 1908746): sent
2020-10-18T20:12:27.628Z DEBUG Batch [2612066, 2670747): sending
2020-10-18T20:12:27.846Z VERBOSE Progress: 34.87%
2020-10-18T20:12:28.193Z DEBUG Batch [2026428, 2085191): sent
2020-10-18T20:12:28.205Z DEBUG Batch [2670747, 2729165): sending
2020-10-18T20:12:28.611Z DEBUG Batch [1967565, 2026428): sent
2020-10-18T20:12:28.625Z DEBUG Batch [2729165, 2787246): sending
2020-10-18T20:12:29.004Z DEBUG Batch [2085191, 2144240): sent
2020-10-18T20:12:29.018Z DEBUG Batch [2787246, 2845431): sending
2020-10-18T20:12:29.105Z DEBUG Batch [2261166, 2319887): sent
2020-10-18T20:12:29.120Z DEBUG Batch [2845431, 2903509): sending
2020-10-18T20:12:29.185Z DEBUG Batch [2378287, 2437167): sent
2020-10-18T20:12:29.199Z DEBUG Batch [2903509, 2961936): sending
2020-10-18T20:12:29.338Z DEBUG Batch [2144240, 2202836): sent
2020-10-18T20:12:29.353Z DEBUG Batch [2961936, 3020580): sending
2020-10-18T20:12:29.517Z DEBUG Batch [2553746, 2612066): sent
2020-10-18T20:12:29.534Z DEBUG Batch [3020580, 3079411): sending
2020-10-18T20:12:29.572Z DEBUG Batch [2437167, 2495895): sent
2020-10-18T20:12:29.587Z DEBUG Batch [3079411, 3138515): sending
2020-10-18T20:12:29.668Z DEBUG Batch [2202836, 2261166): sent
2020-10-18T20:12:29.683Z DEBUG Batch [3138515, 3196527): sending
2020-10-18T20:12:30.156Z DEBUG Batch [2670747, 2729165): sent
2020-10-18T20:12:30.170Z DEBUG Batch [3196527, 3255455): sending
2020-10-18T20:12:30.276Z DEBUG Batch [2319887, 2378287): sent
2020-10-18T20:12:30.292Z DEBUG Batch [3255455, 3313188): sending
2020-10-18T20:12:30.665Z DEBUG Batch [2787246, 2845431): sent
2020-10-18T20:12:30.679Z DEBUG Batch [3313188, 3371955): sending
2020-10-18T20:12:30.714Z DEBUG Batch [2612066, 2670747): sent
2020-10-18T20:12:30.729Z DEBUG Batch [3371955, 3430862): sending
2020-10-18T20:12:30.791Z DEBUG Batch [2729165, 2787246): sent
2020-10-18T20:12:30.807Z DEBUG Batch [3430862, 3489385): sending
2020-10-18T20:12:30.825Z DEBUG Batch [2845431, 2903509): sent
2020-10-18T20:12:30.839Z DEBUG Batch [3489385, 3547844): sending
2020-10-18T20:12:30.899Z DEBUG Batch [3020580, 3079411): sent
2020-10-18T20:12:30.912Z DEBUG Batch [3547844, 3605741): sending
2020-10-18T20:12:31.006Z DEBUG Batch [2495895, 2553746): sent
2020-10-18T20:12:31.021Z DEBUG Batch [3605741, 3663652): sending
2020-10-18T20:12:31.635Z DEBUG Batch [3196527, 3255455): sent
2020-10-18T20:12:31.647Z DEBUG Batch [3663652, 3720864): sending
2020-10-18T20:12:31.773Z DEBUG Batch [2903509, 2961936): sent
2020-10-18T20:12:31.785Z DEBUG Batch [3720864, 3778463): sending
2020-10-18T20:12:31.854Z DEBUG Batch [2961936, 3020580): sent
2020-10-18T20:12:31.868Z DEBUG Batch [3778463, 3835769): sending
2020-10-18T20:12:32.227Z DEBUG Batch [3430862, 3489385): sent
2020-10-18T20:12:32.241Z DEBUG Batch [3835769, 3892633): sending
2020-10-18T20:12:32.412Z DEBUG Batch [3079411, 3138515): sent
2020-10-18T20:12:32.425Z DEBUG Batch [3892633, 3948873): sending
2020-10-18T20:12:32.472Z DEBUG Batch [3138515, 3196527): sent
2020-10-18T20:12:32.485Z DEBUG Batch [3948873, 4004944): sending
2020-10-18T20:12:32.544Z DEBUG Batch [3489385, 3547844): sent
2020-10-18T20:12:32.560Z DEBUG Batch [4004944, 4061459): sending
2020-10-18T20:12:32.684Z DEBUG Batch [3255455, 3313188): sent
2020-10-18T20:12:32.697Z DEBUG Batch [4061459, 4118527): sending
2020-10-18T20:12:32.846Z VERBOSE Progress: 60.46%
2020-10-18T20:12:33.235Z DEBUG Batch [3371955, 3430862): sent
2020-10-18T20:12:33.248Z DEBUG Batch [4118527, 4174290): sending
2020-10-18T20:12:33.413Z DEBUG Batch [3313188, 3371955): sent
2020-10-18T20:12:33.428Z DEBUG Batch [4174290, 4230803): sending
2020-10-18T20:12:33.514Z DEBUG Batch [3778463, 3835769): sent
2020-10-18T20:12:33.527Z DEBUG Batch [4230803, 4285677): sending
2020-10-18T20:12:33.528Z DEBUG Batch [3605741, 3663652): sent
2020-10-18T20:12:33.539Z DEBUG Batch [4285677, 4341670): sending
2020-10-18T20:12:33.638Z DEBUG Batch [3547844, 3605741): sent
2020-10-18T20:12:33.652Z DEBUG Batch [4341670, 4397836): sending
2020-10-18T20:12:34.276Z DEBUG Batch [3892633, 3948873): sent
2020-10-18T20:12:34.288Z DEBUG Batch [4397836, 4453294): sending
2020-10-18T20:12:34.388Z DEBUG Batch [3663652, 3720864): sent
2020-10-18T20:12:34.402Z DEBUG Batch [4453294, 4508957): sending
2020-10-18T20:12:34.436Z DEBUG Batch [3720864, 3778463): sent
2020-10-18T20:12:34.452Z DEBUG Batch [4508957, 4564428): sending
2020-10-18T20:12:34.489Z DEBUG Batch [4004944, 4061459): sent
2020-10-18T20:12:34.504Z DEBUG Batch [4564428, 4619888): sending
2020-10-18T20:12:34.658Z DEBUG Batch [4061459, 4118527): sent
2020-10-18T20:12:34.673Z DEBUG Batch [4619888, 4675347): sending
2020-10-18T20:12:34.821Z DEBUG Batch [4118527, 4174290): sent
2020-10-18T20:12:34.836Z DEBUG Batch [4675347, 4730815): sending
2020-10-18T20:12:34.898Z DEBUG Batch [3835769, 3892633): sent
2020-10-18T20:12:34.913Z DEBUG Batch [4730815, 4786164): sending
2020-10-18T20:12:34.914Z DEBUG Batch [4174290, 4230803): sent
2020-10-18T20:12:34.927Z DEBUG Batch [4786164, 4841169): sending
2020-10-18T20:12:34.960Z DEBUG Batch [3948873, 4004944): sent
2020-10-18T20:12:34.974Z DEBUG Batch [4841169, 4896656): sending
2020-10-18T20:12:35.936Z DEBUG Batch [4397836, 4453294): sent
2020-10-18T20:12:35.952Z DEBUG Batch [4896656, 4952099): sending
2020-10-18T20:12:36.021Z DEBUG Batch [4341670, 4397836): sent
2020-10-18T20:12:36.034Z DEBUG Batch [4952099, 5007666): sending
2020-10-18T20:12:36.087Z DEBUG Batch [4230803, 4285677): sent
2020-10-18T20:12:36.101Z DEBUG Batch [5007666, 5063548): sending
2020-10-18T20:12:36.102Z DEBUG Batch [4285677, 4341670): sent
2020-10-18T20:12:36.116Z DEBUG Batch [5063548, 5119064): sending
2020-10-18T20:12:36.702Z DEBUG Batch [4786164, 4841169): sent
2020-10-18T20:12:36.716Z DEBUG Batch [5119064, 5175783): sending
2020-10-18T20:12:36.803Z DEBUG Batch [4564428, 4619888): sent
2020-10-18T20:12:36.816Z DEBUG Batch [5175783, 5232268): sending
2020-10-18T20:12:36.952Z DEBUG Batch [4453294, 4508957): sent
2020-10-18T20:12:36.967Z DEBUG Batch [5232268, 5288391): sending
2020-10-18T20:12:37.021Z DEBUG Batch [4508957, 4564428): sent
2020-10-18T20:12:37.107Z DEBUG Batch [4675347, 4730815): sent
2020-10-18T20:12:37.200Z DEBUG Batch [4730815, 4786164): sent
2020-10-18T20:12:37.325Z DEBUG Batch [4841169, 4896656): sent
2020-10-18T20:12:37.518Z DEBUG Batch [4619888, 4675347): sent
2020-10-18T20:12:37.589Z DEBUG Batch [4896656, 4952099): sent
2020-10-18T20:12:37.769Z DEBUG Batch [5007666, 5063548): sent
2020-10-18T20:12:37.820Z DEBUG Batch [5063548, 5119064): sent
2020-10-18T20:12:37.848Z VERBOSE Progress: 90.17%
2020-10-18T20:12:38.598Z DEBUG Batch [4952099, 5007666): sent
2020-10-18T20:12:39.110Z DEBUG Batch [5119064, 5175783): sent
2020-10-18T20:12:39.336Z DEBUG Batch [5232268, 5288391): sent
2020-10-18T20:12:39.833Z DEBUG Batch [5175783, 5232268): sent
2020-10-18T20:12:42.849Z VERBOSE Progress: 94.27%
2020-10-18T20:12:47.849Z VERBOSE Progress: 94.27%
2020-10-18T20:12:52.850Z VERBOSE Progress: 94.27%
## the line above keeps repeating even after 15 minutes

Error: Wrong Currency on Business Manager

Hi,

I'm using this tool to upload Offline Events to an offline event set. I set the currency code in the CSV file to ARS, but in Business Manager it is displayed as USD (it converts from ARS to USD).

Could you help me find out where the error is?

Thanks.
Regards.

Failed to Initialize Upload Tag / Error 275

Hi -

Since Tuesday (April 2, 2019) I've been getting error 275, "Failed to initialize upload tag - Cannot determine the target object for this request." See the attached screenshot.

It only happens with the "upload" command; "validate" works fine. From what I can see in the code, though, validate and upload both point to the same place, which suggests that if one works the other ought to as well - i.e., it's not a key issue on my side.

Things I've tried:
I thought it might be an access-token issue, so I refreshed that; I got the latest version of this tool; I upgraded all API calls to the latest version; I made sure the dataSetID was correct; and I tried specifying the dataSetID both in a config file and on the command line. Still nothing.

There's still the very real possibility that this is just me messing something up, but it's starting to seem like it might be something weird/buggy. Does the sudden break suggest anything? Was there a version change? Does Graph now require an ad account ID as well as a dataset ID?

I'd appreciate any help!
Andrew

[screenshot: upload tag error]

Install Errors

Newb to Git & Node...apologies in advance

Win 10 64bit

Followed the instructions below from the cmd console in admin mode:

Make sure git and node are installed and are up-to-date on your machine, then run:

git clone https://github.com/facebookincubator/offline-conversion-file-uploader
npm install
npm run compile

Command Screen output:

D:\Dev\FB-Offline-Inc>git clone https://github.com/facebookincubator/offline-conversion-file-uploader
Cloning into 'offline-conversion-file-uploader'...
remote: Counting objects: 374, done.
remote: Compressing objects: 100% (22/22), done.
remote: Total 374 (delta 4), reused 22 (delta 1), pack-reused 348
Receiving objects: 100% (374/374), 12.64 MiB | 761.00 KiB/s, done.
Resolving deltas: 100% (205/205), done.

D:\Dev\FB-Offline-Inc>npm install
npm WARN saveError ENOENT: no such file or directory, open 'D:\Dev\FB-Offline-Inc\package.json'
npm notice created a lockfile as package-lock.json. You should commit this file.
npm WARN enoent ENOENT: no such file or directory, open 'D:\Dev\FB-Offline-Inc\package.json'
npm WARN FB-Offline-Inc No description
npm WARN FB-Offline-Inc No repository field.
npm WARN FB-Offline-Inc No README data
npm WARN FB-Offline-Inc No license field.

up to date in 1.123s
found 0 vulnerabilities

D:\Dev\FB-Offline-Inc>npm run compile
npm ERR! path D:\Dev\FB-Offline-Inc\package.json
npm ERR! code ENOENT
npm ERR! errno -4058
npm ERR! syscall open
npm ERR! enoent ENOENT: no such file or directory, open 'D:\Dev\FB-Offline-Inc\package.json'
npm ERR! enoent This is related to npm not being able to find a file.
npm ERR! enoent

npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\charl\AppData\Roaming\npm-cache\_logs\2018-09-07T03_19_46_992Z-debug.log

debug.log output

0 info it worked if it ends with ok
1 verbose cli [ 'C:\Program Files\nodejs\node.exe',
1 verbose cli 'C:\Program Files\nodejs\node_modules\npm\bin\npm-cli.js',
1 verbose cli 'run',
1 verbose cli 'compile' ]
2 info using npm@6.2.0
3 info using node@v10.9.0
4 verbose stack Error: ENOENT: no such file or directory, open 'D:\Dev\FB-Offline-Inc\package.json'
5 verbose cwd D:\Dev\FB-Offline-Inc
6 verbose Windows_NT 10.0.17134
7 verbose argv "C:\Program Files\nodejs\node.exe" "C:\Program Files\nodejs\node_modules\npm\bin\npm-cli.js" "run" "compile"
8 verbose node v10.9.0
9 verbose npm v6.2.0
10 error path D:\Dev\FB-Offline-Inc\package.json
11 error code ENOENT
12 error errno -4058
13 error syscall open
14 error enoent ENOENT: no such file or directory, open 'D:\Dev\FB-Offline-Inc\package.json'
15 error enoent This is related to npm not being able to find a file.
16 verbose exit [ -4058, true ]
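
One observation, offered as an assumption rather than a confirmed fix: every ENOENT above points at D:\Dev\FB-Offline-Inc\package.json, while git clone created a subdirectory named offline-conversion-file-uploader. That suggests npm install and npm run compile were run one level too high. Changing into the cloned folder first should let npm find package.json:

cd offline-conversion-file-uploader
npm install
npm run compile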

Unhandled promise rejections are deprecated

Hi - For the last 2 days I have been getting the following error. Nothing was changed on my end.
I even tried to process the same file that was processed successfully earlier, but I still get the same error. I hope this screenshot is visible.

I downloaded a fresh copy of the code but am getting the same error.

[screenshot of the error]

DatasetID - Where do I get that value?

Hi!

I am having trouble uploading offline conversions, specifically with the DatasetID parameter.

Can you help me understand where this value comes from?

{"message":"Unsupported get request. Object with ID '20210721' does not exist, cannot be loaded due to missing permissions, or does not support this operation. Please read the Graph API documentation at https://developers.facebook.com/docs/graph-api","type":"GraphMethodException","code":100,"fbtrace_id":"Ajc4JUR51As_9ZuQtXNjJPt"}

Best,
Antonio

Getting Error: ENOENT: no such file or directory, open

I am trying to execute the sample from the guide. I followed all the instructions, setting up environment variables for accessToken and data_set_id, and provided the test JSON file as described in the guide. After running the command below,

node lib\cli.js upload --configFilePath offline-events\demo-offline-events\config-test-upload.json --inputFilePath offline-events\demo-offline-events\test-event-files\test-events-2019-06-24T135313.357Z.csv --accessToken %access_token% --dataSetID %data_set_id%--apiVersion v3.3

I am getting this error:

Using option accessToken: ''
Using option batchSize: 2000
Using option customTypeInfo: {}
Using option dataSetID: '<data_set_id>'
Using option delimiter: ,
Using option format: {
"event_time": {
"timeFormat": "unix_time"
}
}
Using option header: false
Using option inputFilePath: offline-events\demo-offline-events\test-event-files\test-events-2019-06-24T135313.357Z.csv
Using option logging: verbose
Using option mapping: {
"0": "match_keys.email",
"1": "match_keys.phone",
"2": "event_time",
"3": "event_name",
"4": "value",
"5": "currency"
}
Using option presetValues: {}
Using option reportOutputPath: report.txt
Option uploadTag, uploadTagPrefix and uploadID is deprecated. Please make sure that your file name contains date stamp, and leave the parameters unset. An uploadTag will be generated automaticially based on the name and size of input file. With skipRowsAlreadyUploaded option, the uploader can upload reliably and make sure each row is uploaded once and only once.
Using option uploadTagPrefix: Legacy upload tag prefix
Using option apiVersion: v3.3
2019-06-24T14:07:52.968Z INFO Config and logger initialized.
events.js:167
throw er; // Unhandled 'error' event
^

Error: ENOENT: no such file or directory, open '\offline-conversion-file-uploader\upload-test-events-2019-06-24T135313.357Z.csv-2019-06-24T14:07:52.956Z-txt.log'
Emitted 'error' event at:
at WriteStream. (
\offline-conversion-file-uploader\node_modules\winston\lib\winston\transports\file.js:491:16)
at WriteStream.emit (events.js:182:13)
at lazyFs.open (internal/fs/streams.js:273:12)
at FSReqWrap.oncomplete (fs.js:141:20)

Network Error

Receiving the following error during upload:

[screenshot of the network error]

There are no firewall restrictions on outgoing traffic...

Thanks in advance for any assistance.

Can't run compile

Hi folks,

I'm trying to install this app, but the terminal gives me an error when I run npm run compile.

The error message is:

npm ERR! missing script: compile

npm ERR! A complete log of this run can be found in:
npm ERR! /Users/xx/.npm/_logs/2021-05-12T19_56_37_240Z-debug.log

I checked the package.json file and I do find the compile script, but somehow it just keeps giving me this error. Could anyone please help?

Thank you!

special_ad_category compliance

Looking for information on how we cater for this new compliance requirement:
https://developers.facebook.com/docs/marketing-api/special-ad-category

Including it as an option field returns an invalid message, e.g.:
config entry:
"special_ad_category":"NONE",
response:
Ignoring unsupported option special_ad_category. Please double check spelling and case.

As does including it as a mapped field, e.g.:
...
"19": "item_number",
"20": "special_ad_category"
}
2019-09-26T03:04:09.451Z ERROR Invalid config: Error: Unknown mapping 'special_ad_category' on column 20.

Automate Scheduling

I did not find a command to schedule this tool in cron; it existed in the previous version, but I'm not sure if it works with this one. (0 2 * * * cd && ./marketing-data-file-uploader offline-conversions --accessToken <Your_Access_token> --inputFilePath /offline_conversions_$(date +%F).csv --uploadTag offline_conversions_$(date +%F)) Can you provide the command for this tool?
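
For reference, a sketch of an equivalent crontab entry for this tool; the repository path and config file name are assumptions, and the flags mirror the upload commands quoted in other issues above:

0 2 * * * cd /path/to/offline-conversion-file-uploader && node lib/cli.js upload --configFilePath config.json --inputFilePath /offline_conversions_$(date +\%F).csv --accessToken <Your_Access_token>

Note that % is special in crontab entries and must be escaped as \%.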

The events provided to upload to an Offline Conversion dataset should be in the form of an array

Hi - I am getting the following error. Nothing was changed on my end, either in code or data.

Error Message:

"event_index":14,"message":"Limited Data Use Selection Required: From July 1 to August 1, you must indicate whether usage should be limited for data sent via this API
to be processed. You can learn how to implement Limited Data Use here - https://www.facebook.com/business/help/1151133471911882","type":"CodedException","code":2804030}},"error_subcode":1815356,"is_transient":false,"error_user_title":"Array of offline conversion events is invalid.","error_user_msg":"The events provided to upload to an Offline Conversion dataset should be in the form of an array. And each event should conform to specifications based on the type of the event.","fbtrace_id":"ASEDB9_h4AcaCCyvrUfiSxP"}}
2020-07-29T13:55:30.590Z WARN Batch [2525714, 2840337): API error: Invalid parameter

FEATURE: appsecret_proof argument

Can an additional parameter be added to this code so that an appsecret_proof argument can be supplied when uploading data?

All my API calls are failing because the Require App Secret toggle is turned on in the Facebook developer settings. This is the error message I receive:

{"error":{"message":"API calls from the server require an appsecret_proof argument","type":"GraphMethodException","code":100,"fbtrace_id":"<>"}}

When I tested the upload with this toggle turned off, it worked smoothly.
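
For context, appsecret_proof is documented by Facebook as the SHA-256 HMAC of the access token, keyed with the app secret, passed as an extra parameter on each server-side call. A minimal sketch of how it could be computed (the function name is illustrative; this is not code from this repository):

const crypto = require('crypto');

// appsecret_proof = HMAC-SHA256(key = app secret, message = access token),
// hex-encoded, sent alongside access_token on each API call.
function appSecretProof(accessToken, appSecret) {
  return crypto.createHmac('sha256', appSecret).update(accessToken).digest('hex');
}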
