particle-iot-archived / particle-dev
Particle Dev package for Atom
Home Page: https://www.particle.io/dev
License: Apache License 2.0
This means bad password, right? I'm using a password that works fine in the Web IDE.
One possible problem: I probably capitalized my e-mail address, but I see most of the Spark platform shows it lower-case...
Trying to compile a project I copied over from the web IDE (works there). I get this every time (Mac version).
Window load time: 1056ms -> /Volumes/BayHD/Unsorted/mac/Spark%20Dev.app/Contents/Resources/app/static/index.js:39
Uncaught Error: EISDIR, read -> stream.js:94
When I click Spark in the top menu (Spark Cloud), I can't type "@" in the e-mail address field; Ctrl+Alt doesn't work.
The zip file of the release 16 build only includes the Mac build; there are no files for Windows.
When you attempt to flash an offline core, the eventual error message is "socket hang up". This appears to be the error message coming from SparkJS, as the same behavior occurs with Spark-CLI and with the Web IDE.
However, the user will have difficulty understanding it until they realize the core is offline. A better option would be to detect that the core is offline prior to flashing and output the error message accordingly ;)
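A pre-flight check along those lines could be sketched as below, assuming the device list returned by the cloud's GET /v1/devices endpoint (each entry carries a `connected` flag). `findOfflineDevices` is a hypothetical helper, not part of Spark Dev:

```javascript
// Hypothetical helper: given the device list from the cloud API's
// GET /v1/devices endpoint, return the names of devices that are
// offline, so the IDE can warn before attempting a flash.
function findOfflineDevices(devices) {
  return devices
    .filter(function (device) { return !device.connected; })
    .map(function (device) { return device.name; });
}

// Example: core2 is offline, so it would trigger the warning
// instead of a bare "socket hang up" after the flash attempt.
var offline = findOfflineDevices([
  { name: 'core1', connected: true },
  { name: 'core2', connected: false }
]);
```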
The download for Mac OS gets an .exe file... surely this is a mistake...
1.) This might be covered in the documentation on which button does what
2.) Existing knowledge from using the Web IDE
I propose adding a tooltip describing what each button does, to guide new users who do not read the documentation. It's also better for people who have used it before but come back after a long time, naturally re-orienting them.
Example:
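The proposal could look something like the sketch below. The button names and descriptions are illustrative placeholders, not the actual Spark Dev identifiers; in Atom, each entry could be wired up with `atom.tooltips.add(element, {title: ...})`:

```javascript
// Illustrative tooltip text for the toolbar buttons (names and wording
// are placeholders, not the actual Spark Dev identifiers).
var TOOLTIPS = {
  compile: 'Compile the current project in the cloud',
  flash: 'Flash the compiled firmware to the selected device',
  serial: 'Open a serial monitor for the connected device'
};

// Look up the tooltip for a button, with a safe fallback.
function tooltipFor(buttonName) {
  return TOOLTIPS[buttonName] || '';
}
```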
When you try to log in and paste the password, it is displayed in clear text. When you type the password, it is displayed correctly.
When you use copy-paste, the field does not detect that something has been entered.
Spark Dev, based on Atom, only allows editing offline.
However, compilation is still done in the cloud.
Is there any plan to add offline compilation?
Would be great to have a deb package for Ubuntu
I deliberately placed the .ino file on my desktop, and there are no spaces in the file paths, as seen in the screenshot.
Here's the error that was thrown:
Uncaught TypeError: Cannot read property '0' of undefined
  /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/lib/spark-dev.js:548
  (anonymous function) /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/lib/spark-dev.js:548
  tryCatchReject /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/mak…:830
  runContinuation1 /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/mak…:789
  Fulfilled.when /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/mak…:580
  Pending.run /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/mak…:471
  Scheduler._drain /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/Sch…:62
  Scheduler.drain /Applications/Spark Dev.app/Contents/Resources/app/node_modules/spark-dev/node_modules/when/lib/Sch…:27
  _tickCallback node.js:378
I would love to see the Spark DFU-UTIL package installed by default.
Also, I had dfu-util working in 0.0.13. I installed 0.0.14 just now and I'm getting the error below... if I delete ~/.sparkide/, dfu-util is gone and I get other errors trying to search for it in the package manager. If I reinstall it with 0.0.13 and then open 0.0.14, I get the error below...
http://i.imgur.com/BLTcH5J.png
The following line causes "0 errors" in Version 0.0.16
#include "application."
Compiling with spark-cli generates the error "fatal error: application.: No such file or directory"
I grabbed the Mac OS X download from https://github.com/spark/spark-dev/releases. When I extracted it, there was a zip file within the zip, alongside the .app. Is this intended?
When just typing in a new program from scratch and then going to save it somewhere, if you save it in a directory full of other files, it will try to compile all of those files. Perhaps a solution is to force the new file to be saved in an enclosing directory with the same name as the file you are creating? Sound familiar? Yeah, the Arduino IDE does this. I guess it's a common problem.
I understand why Arduino may do this, but I still never liked being forced to keep them the same name. That restriction wouldn't be absolutely necessary, since we can currently compile one or more files from any named directory, but the question is how to manage it. If it's just something you have to know, fine... but then how can we generate helpful error messages, suggestions, or GUI cues while saving a "new" file?
Currently in the Web IDE, if you make changes and press Verify, it saves the files automatically and then compiles.
In Spark Dev, it will only read the current state of the files saved on your hard drive and compile those. This is a common "gotcha" with standalone IDEs: you make a change and recompile without saving first, don't see the change, and then spend considerable time trying to figure out why the change you can see in the editor is not reflected on your device. It could be solved by automatically saving all open files when attempting to Verify or Flash. If files are unsaved, it should probably ask you to save them.
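One way to sketch the save-before-compile idea: decide which open editors need saving from their modified state and on-disk path, then save each before kicking off the cloud compile. The plain objects here are a simplification of Atom's TextEditor API (`isModified()`/`getPath()`), and the real loop would run over `atom.workspace.getTextEditors()`:

```javascript
// Simplified sketch: pick the editors that have unsaved changes and a
// file on disk. In Atom these would be TextEditor objects; here each is
// a plain object with `modified` and `path` fields for illustration.
function editorsToSave(editors) {
  return editors.filter(function (e) { return e.modified && e.path; });
}

// Before Verify/Flash, something like:
//   editorsToSave(openEditors).forEach(function (e) { e.save(); });
```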
Related to this:
There is a feature I had planned to ask for that is supported in the Arduino IDE, where you can make changes to a file and compile and flash the unsaved changes to your Arduino. This is useful for quickly editing the examples without having to find a place to save them, or for copy-pasting some code from the web to try out in much the same way. Sometimes you just want to try something without saving it and cluttering up your hard drive with one more set of files.
If unsaved files were handled by copying all files in your project, plus all open unsaved buffers, to a temporary location whenever you press Verify or Flash, it could compile unsaved or saved files transparently.
Moved to #45
Settings for all of this would make it powerful and fully in the user's control. For example, you could have settings for autosaving or not, compiling unsaved changes or not, asking to save unsaved files when compiling or flashing... or just saving them automatically.
See issue with pictures here: https://community.spark.io/t/spark-dev-app-code-was-invalid/8289
Essentially, whenever I try to compile a .ino file, the compiler says "App code is invalid" and doesn't allow me to compile. Any idea why it won't compile? The Spark Core is powered and connected to the same network as the computer running Spark Dev.
When compiling with Spark Dev, we don't see any information about text+data usage as we can in the Web IDE.
Would it be possible to add this feature so we can see what the program uses?
Thank you very much for your help.
Currently a Spark Dev project is a directory with only one entry point. Libraries, on the other hand, keep their code in firmware/ and multiple examples in firmware/examples. This could be resolved by specifying a "Target" that points to the file containing the entry point, ignoring the rest.
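The "Target" idea might work along these lines; the explicit-target parameter is a proposal for illustration, not an existing spark.json key:

```javascript
// Sketch: decide which file is the project's entry point. If an explicit
// target is configured, use it; otherwise fall back to a lone .ino file.
// Returning null signals an ambiguous project (e.g. multiple examples).
function resolveEntryPoint(files, target) {
  if (target) return target;
  var inos = files.filter(function (f) { return /\.ino$/.test(f); });
  return inos.length === 1 ? inos[0] : null;
}
```

With this, a library checkout containing several files under firmware/examples would compile only the configured target, while a simple one-.ino project would keep working with no configuration at all.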
Had an issue where a user was compiling a project that contained a Spark Library, which had a lib/ subfolder full of source and header files.
http://community.spark.io/t/spark-dev-editor-problem/8009
It would be great to support this file structure natively so users do not have to manage two different sets of files. However, some Spark Libraries contain multiple examples, so this may not be completely possible. At the least, we should ignore the config files (spark.json, README.md, etc.) and support subdirectories.
If I open 'Cloud variables & functions' and then flash new code with different variables, there is no way to see the new variables.
This change should be detected automatically, or there should be a refresh button.
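A refresh could be as simple as re-fetching the device's variable list from the cloud and diffing it against what the panel currently shows; `newVariables` is a hypothetical helper, not part of Spark Dev:

```javascript
// Hypothetical helper: given the variable names currently shown in the
// panel and a freshly fetched list, return the ones that are new and
// therefore need to be added to the display.
function newVariables(shown, fetched) {
  return fetched.filter(function (name) { return shown.indexOf(name) === -1; });
}
```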
Windows 7 Professional x64 (current with all updates)
Signed in to Spark Cloud successfully
Version 0.0.16
I receive an error when trying to compile in the cloud (the check button in the upper left of the editor):
Window load time: 6017ms -> index.js:39
Uncaught TypeError: Cannot read property '0' of undefined -> c:\Program Files (x86)\Spark Dev\resources\app\node_modules\spark-dev\node_modules\when\lib\decorators\unhandledRejection.js:100
I tried this on my Macbook Pro last night and received the same error.
First, some facts: you can open a directory without opening a file, or you can open a file, in which case the directory it's in will be displayed along with all of the files in that directory.
If you open a directory with one file, say ./test1/test1.ino, go ahead and compile it, and all is well.
Now open a file from another directory, say ./test2/test2.ino. Notice that the open directory does not change; it's still showing test1.
With test2.ino active, click the compile button and everything will seem to work fine. You can even flash it to your device... but what do you have? It will be test1.ino's binary, not test2.ino's.
Does it make sense to change the directory to whichever file is actively selected, and compile based on the open file? Personally, I believe it's intuitive that the file I have displayed is the one that will be compiled. With this scheme you could have a bunch of different examples open, programming whichever one is active... as long as each is in its own directory ;-)
You can open the ./test2/ directory specifically, as stated in the facts above; however, this is fairly cumbersome to remember to do.
Currently, only invoking it via the drop-down list works; the sidebar button doesn't.
When you hit the WiFi icon, this command pops out, and the box highlighted in red has the typing icon and the text "Enter SSID manually".
It came across to me as "please start typing", and the moment I did that, the IDE prompted an error.
The blinking type icon is an Atom IDE thing, and I'm not sure how easy it is to change the behavior, but this is definitely going to cause some confusion.
Originally by @technobly:
There is a feature I had planned to ask for that is supported in the Arduino IDE, where you can make changes to a file and compile and flash the unsaved changes to your Arduino. This is useful for quickly editing the examples without having to find a place to save them, or for copy-pasting some code from the web to try out in much the same way. Sometimes you just want to try something without saving it and cluttering up your hard drive with one more set of files.
If unsaved files were handled by copying all files in your project, plus all open unsaved buffers, to a temporary location whenever you press Verify or Flash, it could compile unsaved or saved files transparently.
On Mac, CPU utilization jumps 20% when I connect my Spark Core via USB. It goes up when I open the Serial monitor and connect to view serial communication.
I'm using OS X Yosemite.
So I exposed one variable and one function to test and make sure this feature is working as expected.
However, only the variable is listed; the function panel says "no functions registered".
Using Spark-CLI, I am able to see the function listed through spark list.
Be able to install Spark Dev as a package into Atom.
In the Web IDE, verifying code to check for errors does not download the binary file locally.
If we turn this on by default, every time a user wants to check for code errors, a new binary file gets stored on their machine.
I open a source file and press the Flash button.
Uncaught TypeError: Cannot read property '0' of undefined
resources\app\node_modules\spark-dev\node_modules\when\lib\decorators\unhandledRejection.js:100
If you just assume your code is good, go for the "I'm Feeling Lucky" option, and hit the Flash button before Verify, the process will fail. I'm not sure why, because it does compile the code and download a binary... it just errors, saying "invalid request" in red in the status bar after it compiles and downloads the binary.
This behavior of hitting Flash first is supported in the Web IDE and is natural for quick changes that you know are good. There is no need to Verify first, so I'm guessing there is just a bug here to work out.
The documentation link at the bottom of README.md points to http://docs.spark.io/dev, which results in a 404 with this error:
404 Not Found
Code: NoSuchKey
Message: The specified key does not exist.
Key: dev
RequestId: 47981B8926FFE1E2
HostId: +lKYY5LFaikjUlKmp/mYKCNm9QO+Y75cbMiRLvN/rt8Bc/kgluYGUxHp1MBJEmrg